{
  "version": 3,
  "sources": ["ssg:https://framerusercontent.com/modules/GRQPbsSNLbAUn00qGznl/Guay6vnGLtHP8uT8veKs/Y122M6z_Y-3.js"],
  "sourcesContent": ["import{jsx as e,jsxs as t}from\"react/jsx-runtime\";import{Link as o}from\"framer\";import*as a from\"react\";export const richText=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"Keypoint skeletons are a series of points, joints, and shapes that you can predefine apply to an object, and then adjust to meet its pose. You can use them to define the pose and shape of an object, or as a handy way of creating some custom polygons that you can reuse throughout your project. Let's take a look at how you use them in image and video annotation.\"}),/*#__PURE__*/e(\"p\",{children:\"In your class creation modal, you'll notice a skeleton option at the bottom right, we'll select this option and give it a name. In this case, we'll create a human hand. We'll pick a nice color for it. Let's go for teal. And once we pick this skeleton class type, we'll be presented with a skeleton editor window.\"}),/*#__PURE__*/e(\"p\",{children:\"This is where we'll draw our shape and it's quite simple. Just like drawing a regular polygon. We'll click to create some points. In this case, I'll make a series so that I can create a whole human hand. Note that each point is unique and automatically given a name in this case. And number next up, I can choose to connect them.\"}),/*#__PURE__*/e(\"p\",{children:\"In this case, this will create the bones and the skeleton, if you will. Use this link button to connect them with one another. You can connect the point with as many other points as you wish or also leave them alone. And finally, when you're happy with your shape, you can also rename these points. For example, this one here is the thumb tip, so it'll be useful to recognize it when I draw it.\"}),/*#__PURE__*/e(\"p\",{children:\"This one is the wrist and so on. This will be quite useful while you're drawing your skeleton as its shape may morph significantly. Especially in video. 
You may rename and move these origin points later, but you cannot add new ones. So make sure your schema is correct. When you're done, you can add a description as usual and select a thumbnail.\"}),/*#__PURE__*/e(\"p\",{children:\"In this case, we have a skeleton hand here that I can use to represent it. When you're using the skeleton tool, simply drag the shape like a bounding box and adapt it to your object. In this case, we'll move these joints of a person so they match their skeleton. Starting with a template like this makes it much faster and easier to adapt objects to a new shape, especially if you have many types of skeletons in a dataset.\"}),/*#__PURE__*/e(\"p\",{children:\"You'll also encounter cases in which your objects are flipped on the vertical axis. In that case, simply draw the box the other way around and the points will be mirrored. And even though it's called a skeleton tool, it can be used for any type of custom polygon. For example, you may have a weird type of cuboid such as this one, or you may need to apply a pentagon to represent an object.\"}),/*#__PURE__*/e(\"p\",{children:\"All you have to do is draw it in the skeleton editor and then apply it to your images. Now let's take a look at how skeletons behave in video. We'll create a new annotation class. In this case, a right eyebrow. Select the skeleton tool and we'll define five points and connect them together. We'll also define an eyebrow tip so we know its orientation.\"}),/*#__PURE__*/e(\"p\",{children:\"Keypoint skeletons can interpolate between frames on V7 Darwin, so we'll only have to label a few frames of this video to get a smooth transition that looks realistic and represents ground truth. I am playing and pausing the video here, and every time I pause, I move the whole shape or some of the points.\"}),/*#__PURE__*/e(\"p\",{children:\"This will automatically generate keyframes for me, and now I can pass on to the other eyebrow, which I drew the other way around. 
I'm going to flip this over and start with this left eyebrow. You'll notice whenever I hover over a point to move it, it tells me the name of that point. Those are the names that we defined during the skeleton editing phase.\"}),/*#__PURE__*/e(\"p\",{children:\"You'll also notice that I can move the whole shape if I drag one of the vectors or bones, if you will, which will also generate a new keyframe. V7 Darwin allows you to play 4K videos like this one, even when zoomed in, so you can focus on a specific point of action and make sure that your ground truth is accurate.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, I'm sure you have better things to do than watch me label this eyebrow, so I'm going to speed this up significantly. When you feel like you're done, you can drag the play head left and right just to make sure that every frame is sticking to the right points of your object. And now let's take a look at the final result.\"}),/*#__PURE__*/e(\"p\",{children:\"You can continue to pan around and zoom in and out while the video is playing, or toggle display options, for example, brightness, contrast, or the display label that you're seeing over here. These are looking quite good for a few minutes of work. If you're happy with this, you can press the send to review button on the top right.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay. One last example. I hope you haven't forgotten about the hand that we drew at the beginning of this video. We're going to use it to define the position and pose of this human hand over here. Otherwise, how else will we teach robots to pet dogs one day? We're going to move these joints to match the position of this hand, even though it's a bit occluded.\"}),/*#__PURE__*/e(\"p\",{children:\"You'll see how useful the point names are in this example, in which there are so many types of joints in this hand, and we have to remember which pose is where between frames. 
As per the previous example, we'll play and pause the video and move the points to match their new position, after which we'll trace back, play the video again and make sure that they're sticking correctly.\"}),/*#__PURE__*/t(\"p\",{children:[\"And after a couple of minutes of work, we have our results. I'm going to leave you with this. You can try this skeleton editing tool \\xa0now at \",/*#__PURE__*/e(o,{href:\"https://www.v7labs.com/\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"v7labs.com\"})}),\".\"]})]});export const richText1=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"In this session, we dive into V7 slots, which allows you to annotate multiple files on the same screen simultaneously. This feature is especially valuable for medical use cases, for example, mammography hanging protocols, and any scenario that requires multiple files and file types on screen at once. Let's say you need to annotate a dataset with PDF instructions on an item level rather than a dataset level \u2013 that's where slots come in handy.\"}),/*#__PURE__*/e(\"p\",{children:\"Slots offer a seamless way to register PDF files or other additional information alongside the main image. This means annotators always have specific instructions or reference data available while they annotate. For example, you could load an ultrasound image on the right slot and a healthcare record as a PDF on the left slot, allowing annotators to make more informed decisions.\"}),/*#__PURE__*/e(\"p\",{children:\"This video takes you through the process of uploading data into specific slots using the V7 REST API. 
Although the demonstration involves medical images (DICOM files), you can use the same process for any file types, including images, videos, and PDFs.\"}),/*#__PURE__*/e(\"p\",{children:\"The video covers step-by-step instructions, including setting up the necessary imports, signing and uploading images, and confirming the successful upload. You'll learn how to handle multiple slots for a single item and how to access annotations for individual slots within a Darwin JSON file.\"}),/*#__PURE__*/e(\"p\",{children:\"By the end of this video, you'll have a comprehensive understanding of how to leverage slots Annotation in V7 to enhance your annotation workflow. Whether you're dealing with medical datasets or any other scenario that requires concurrent annotation of multiple files, this feature will boost efficiency and accuracy.\"})]});export const richText2=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"For some annotation tasks, having just one file available on screen just isn't enough.\"}),/*#__PURE__*/e(\"p\",{children:\"Luckily, you can annotate multiple files on the same screen at once, using slots.\"}),/*#__PURE__*/e(\"p\",{children:\"This is especially useful for medical use cases, like mammography hanging protocols. However, Slots can be used for any scenario where multiple files and file types are required on screen at once.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's say you require pdf instructions on an item level rather than a dataset level. Slots enable you to register a pdf file with additional information about the item so the image-specific instructions are always visible to an annotator while they are annotating. 
In this instance, you would load an ultrasound image in the right slot and a healthcare record as a PDF in the left slot.\"}),/*#__PURE__*/e(\"p\",{children:\"You could also use Slots to render different zoom levels of the same image on the screen at once or to render satellite imagery in one slot and the corresponding ground imagery in the other.\"}),/*#__PURE__*/e(\"p\",{children:\"As you can see, slots can be used in many different ways to suit your specific use case best. However, let's have a look at how to upload data into specific slots using the REST API. We already have a full video on uploading and registering data to V7 - so I won't go too much into detail here.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's look at the code. We'll start with the imports and since we are using the REST API, the only library that we really need is the requests library. I'm also importing my API key here, which I have stored in a separate file so you don't see what my API key is.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, now that we have the imports out of the way, let's look at step zero: which images do we want to upload?\"}),/*#__PURE__*/e(\"p\",{children:\"In this example, I have an extra directory that is called breast where I have two DICOM images, one for the left breast and one for the right breast. So, after getting those two images, I'll store them in a little dictionary and I'll separate them into a key and a value for the item name and for the respective path of that specific item.\"}),/*#__PURE__*/e(\"p\",{children:\"So, when running the cell, we can see again what I just mentioned, the item name itself as the key and the path to that item as the value. This is really handy to then just loop over the individual items which we need to do to upload the data. 
We will have to upload every single item separately.\"}),/*#__PURE__*/e(\"p\",{children:\"If that doesn't make sense yet, we'll look at it in a second.\"}),/*#__PURE__*/e(\"p\",{children:\"So let's go to step one.\"}),/*#__PURE__*/e(\"p\",{children:\"The first step will be to again register the data. We have to register each single file to V7 so that V7 knows how many files we are going to upload and what they are called. To do that we specify the URL, providing our slugified team name and the dataset slug will be used in a second.\"}),/*#__PURE__*/e(\"p\",{children:\"For the whole payload of the message that we are sending, we are going to have the same headers as always. As for the payload, this is the part that we have to tweak a bit from the payload that we used in the full registry or upload video. When working with Slots we have to provide all the different items in the specific slots.\"}),/*#__PURE__*/e(\"p\",{children:\"Here, we have a list of all items. In this case, we have one item: one breast item that has two images inside of it, or in other words, two Slots for those images. \\xa0Inside this list for all items, I will again be iterating over the actual images that are going to be stored in the specific Slots.\"}),/*#__PURE__*/e(\"p\",{children:\"Here, I'm going to add a list of all Slot items for one particular global item. We'll provide the filename of the actual file that we'll be uploading, in this case, it will be \u201Cbreast left\u201D and \u201Cbreast right\u201D.\"}),/*#__PURE__*/e(\"p\",{children:\"So let's start with \u201Cbreast right\u201D. \\xa0I'm going to provide the Slot name, and the slot name in this case will just be 0 for the first slot, and for the second item, (the \u201Cbreast left\u201D) the slot name will be 1. That's how we associate individual images to the specific Slots of the whole item.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, I hope that wasn't too confusing. 
As mentioned, it's just a tiny list comprehension iterating over the individual items that you want to assign to one Slot. After we have built this payload, we can just send our request and we'll receive an answer, where we have a list of all the items that we want to upload.\"}),/*#__PURE__*/e(\"p\",{children:\"Again, the \u201Citems\u201D in this example are just going to be one item, namely an item called \u201Cbreast\u201D, but this one item will have multiple Slots, multiple files, and images in the Slots. Not only that - each individual Slot will have an upload ID which means that we'll have to upload each individual image to V7.\"}),/*#__PURE__*/e(\"p\",{children:\"So let me quickly just extract those two upload IDs for the two individual images. I'm going to store them in another dictionary where the key is again the upload ID and the value is going to be the path to the actual item that we want to upload.\"}),/*#__PURE__*/e(\"p\",{children:\"Then we get to the upload loop that iterates over all the individual items. The first step that we need to do in this loop or the second step in this whole upload process is to sign our image. When signing this image, we'll get a response that includes the upload URL to this specific image.\"}),/*#__PURE__*/e(\"p\",{children:\"From here, it's really simple. We'll just need to read in the data, read in the one specific image, and then upload the image to the provided specific upload URL for this one image. Once that is done, we can confirm the upload.\"}),/*#__PURE__*/e(\"p\",{children:\"And, that's pretty much it. If I run this cell, we see that we have two successful responses printed out. Now that\u2019s done, I'll show you that the files are uploaded to the dataset.\"}),/*#__PURE__*/e(\"p\",{children:\"When I refresh the UI, we can see that I now have one item here that is called \u201Cbreast\u201D. If I open this item, we can see that I have two slots with two separate images. 
One for the \u201Cright breast\u201D and one for the \u201Cleft breast\u201D.\"}),/*#__PURE__*/e(\"p\",{children:\"This way I can compare them better when doing my annotations.\"}),/*#__PURE__*/e(\"p\",{children:\"It\u2019s worth noting that the two files that you want to upload into two slots don't have to be of the same type. One could be a video, for example, and the other a PDF or a normal PNG. To show that this works, I have a second directory prepared with an image and a video.\"}),/*#__PURE__*/e(\"p\",{children:\"Here, I have an image of a floor plan and a video of a person going through the apartment. I will just skip through this code because the code is exactly the same.\"}),/*#__PURE__*/e(\"p\",{children:\"I will get my IDs corresponding to each item itself, and then I will run this sign, upload, and confirm loop. Now, since this is done, we can go ahead and just open the UI, and we can see that I again have a new item right here, which is called \u201Croom\u201D, and if I open this item, we can see that I again have two slots, one for the PNG of the floor plan, and one for the video of the room.\"}),/*#__PURE__*/e(\"p\",{children:\"I can now iterate over every frame, and when I want to label something based on the room we are in, I can make better decisions because I have the floor plan. I know we are currently in the living room and, for example, when going to the balcony, I would know that this balcony is directly attached to the living room.\"}),/*#__PURE__*/e(\"p\",{children:\"Or when going to one of the bedrooms, I would know whether the bedroom I am in is bedroom A, B, or C, or however they are labeled. So, it's just really nice guidance for the annotators, giving them a reference point, for example.\"}),/*#__PURE__*/e(\"p\",{children:\"Since we have two images in one item, how do we access the annotations that we do on the individual images on the individual Slots? 
That's a very good question.\"}),/*#__PURE__*/e(\"p\",{children:\"So, let's simply go ahead and export our breast example where I have done some annotations, and look at the annotation file, our Darwin JSON annotation file. To do that, we'll just select this one file, go to export data, and create a new export. Let's call this one breast as well.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's take the selected file and just export the item. Once it's done generating the export, we can go ahead and download our file. We again have a full video on the Darwin JSON file format where we go into detail about how it is structured.\"}),/*#__PURE__*/e(\"p\",{children:\"To deal with the Slots, let's look at the individual exports.\"}),/*#__PURE__*/e(\"p\",{children:\"Here we have the item field. The item field lists all the metadata of the individual items that we have. In this case, we only have one item, the \u201Cbreast\u201D item, and this has multiple Slots. Here we see a list of all the different slots. We have here slot 0, which was our left slot, which was the right breast, and we here have some metadata like the width and height.\"}),/*#__PURE__*/e(\"p\",{children:\"This is the second Slot, \u201Cslot 1\u201D and its metadata. Now, if we go down to the actual annotations, we have a similar thing: we have a list of all the different annotations of this one item. For example, here we have one bounding box.\"}),/*#__PURE__*/e(\"p\",{children:\"How do we know to which of the two images this bounding box corresponds?\"}),/*#__PURE__*/e(\"p\",{children:\"For this specific annotation, we have the Slot name. That\u2019s how we know that this specific bounding box corresponds to Slot 0 - the left image, so the image of the right breast.\"}),/*#__PURE__*/e(\"p\",{children:\"It\u2019s the same for this bounding box. 
It's the second bounding box of the left image, and our third bounding box right here belongs to slot number 1, the right image.\"}),/*#__PURE__*/e(\"p\",{children:\"In this example, we uploaded data into two Slots - but you can add as many Slots as you like. For more details, please have a look at the documentation.\"}),/*#__PURE__*/e(\"p\",{children:\"The only option to upload data into Slots is via the REST API, but with this video and the documentation, you are equipped with all the details you need.\"}),/*#__PURE__*/e(\"p\",{children:\"That was it! Slots are a powerful tool to annotate multiple files on the same screen - or to always have a guidance file next to the actual item that is to be annotated.\"}),/*#__PURE__*/e(\"p\",{children:\"You now know how to upload your data into a Slots layout - and you know how to work with the exported annotations that include multiple slots.\"}),/*#__PURE__*/e(\"p\",{children:\"I hope this video helped you get started with V7.\"})]});export const richText3=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"Welcome to V7! In our Academy Series, we teach you how to build AI with V7\u2019s Darwin, a powerful engine that empowers any business to develop state-of-the-art AI models. In this tutorial, we tackle User Roles, from how they work, to the powers that come with each type of role.\"}),/*#__PURE__*/e(\"p\",{children:\"V7 provides five different User Roles: Team Owner, Admin, User, Workforce Manager, and Worker. The Team Owner is the user that first created the team within the platform. As a result, this user has the most permissions, from assigning roles to removing datasets to deleting teams entirely.\"}),/*#__PURE__*/e(\"p\",{children:\"Following Team Owner is the Admin role. This user has all the permissions of a Team Owner, barring the ability to delete the team or change the Team Owner.\"}),/*#__PURE__*/t(\"p\",{children:[\"Next, we have the User. 
This role has a number of permissions, including the ability to \",/*#__PURE__*/e(o,{href:\"https://www.v7labs.com/dataset-management\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"add datasets\"})}),\" and new users. They can also adjust the permissions of any User, Workforce Manager, and Worker.\"]}),/*#__PURE__*/e(\"p\",{children:\"The Workforce Manager\u2019s permissions are restricted to the datasets that they are added to. Here, they can assign and review tasks - in addition to inviting Workers.\"}),/*#__PURE__*/e(\"p\",{children:\"Finally, Workers (otherwise known as Annotators) are your labeling workforce. They have all the permissions they need to complete their tasks but have restrictions beyond that to protect the integrity of your pipeline.\"}),/*#__PURE__*/e(\"p\",{children:\"In this video, we explore the details of how these User Roles work, where to find their permissions, how to assign them, and how to use each User Role to create a well-functioning development process. With this video, you\u2019ll learn how to structure your project, and your people, to set you up for AI success.\"}),/*#__PURE__*/t(\"p\",{children:[\"Looking for a little bit of extra detail? Head to our \",/*#__PURE__*/e(o,{href:\"https://docs.v7labs.com/docs/user-roles#user\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"User role documentation\"})}),\" to see a comprehensive breakdown of the powers that come with each role.\"]})]});export const richText4=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"This cog at the bottom left takes you to team settings. Here, you'll find the member tab on the left. Everyone in your team is listed here, from pending invites, to full team members, to annotators with limited access. Use this invite field to add more team members. 
Enter their email, and press the paper plane icon, or hit enter.\"}),/*#__PURE__*/e(\"p\",{children:\"If you've made a mistake and invited the wrong Larry to your team, you can always delete this invite with the trash can button. They will still receive an invite email, but the link will be invalid. If an invite link has expired because too much time has passed, you can always delete an invite and resend it.\"}),/*#__PURE__*/e(\"p\",{children:\"Beside each user, there is a dropdown for their role. They can be annotators. These have very limited visibility of a dataset. They cannot assign images to other users or view any other user in the team. Annotators can request and complete annotation or review work. But they can't really do much else.\"}),/*#__PURE__*/e(\"p\",{children:\"Users are much more powerful. Those are the data scientists or data managers in your team. They can create datasets, add data, generate export versions, and export data; they can invite other users as well as annotators; and they can also annotate and review images. They cannot delete comment threads started by admins, and they cannot invite other admins to the team.\"}),/*#__PURE__*/e(\"p\",{children:\"But they can essentially do anything operationally relevant to the job of a data scientist. Admins are just like users who can delete anything they want, invite anyone they want, and make important billing decisions like changing a credit card number. They are also the only user class who can make model training requests.\"}),/*#__PURE__*/e(\"p\",{children:\"The last remaining user role is the team owner. This is the user that first created the team, and they can assign this role to another admin. They're the only user who can delete the team, and the main point of contact for any terms and conditions issues. Finally, to remove someone from your team, just press the trash button next to their name.\"}),/*#__PURE__*/e(\"p\",{children:\"Don't worry. Removing someone from your team will not delete their account. 
Any annotations they made will stay, but become authorless, and any task assigned to them will be unassigned so that someone else may pick it up.\"})]});export const richText5=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/t(\"p\",{children:[\"In this Darwin Fundamentals session, we tackle \",/*#__PURE__*/e(o,{href:\"https://www.v7labs.com/video-annotation\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"Video Annotation\"})}),\" within V7\u2019s Darwin. Whether you\u2019re dealing with long or short videos, countless clips, or individual frames, Darwin is built to streamline the video annotation process for a seamless experience.\"]}),/*#__PURE__*/e(\"p\",{children:\"Video Annotation, historically, can be a painful exercise. However, we\u2019ve ensured that Video Annotation within Darwin is an easy-to-manage, intuitive, and accurate process.\"}),/*#__PURE__*/e(\"p\",{children:\"In this video, we start by revealing the flexibility that comes with video in Darwin, highlighting the various video data types, including drag-and-drop options and CLI commands. We show how users can set their desired frame rate, which determines the frequency of annotation sampling. We also explain how Darwin automatically processes videos to ensure annotations can be performed at the highest resolution - without any image quality loss or frame miscounting.\"}),/*#__PURE__*/e(\"p\",{children:\"Next, we dive into Video Annotation, highlighting the stacked timeline (showcasing annotations and when they occur), frame-by-frame labeling, and interpolation - to smoothly track objects over time. 
We also explore Keyframes in detail, explaining their importance (such as holding valuable sub-annotation information), and how to use them within your project.\"}),/*#__PURE__*/t(\"p\",{children:[\"You\u2019ll leave this video with a clear understanding of Video Annotation, best practices for annotating within Darwin, and how to leverage the platform to cut Video Annotation time from hours down to minutes. Keen to find out a bit more about Video Annotation for AI? Dive into our \",/*#__PURE__*/e(o,{href:\"https://www.v7labs.com/blog/video-annotation-guide\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"2023 Video Annotation guide\"})}),\".\"]})]});export const richText6=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"Video annotation can be a real pain, but we promise it will be a breeze on V7's Darwin platform. Let's jump right in. You can load any form of video data, long videos, short videos, multiple videos, drag and drop them, or use the CLI commands below. Darwin will prompt you to choose a frame rate to sample from.\"}),/*#__PURE__*/e(\"p\",{children:\"This will determine how many frames you want to label and how frequently, so if you're working with object tracking or in most videos, choose something like 15 fps or higher. Sometimes, though, you might want to choose something very low. For example, in general, object detection, you could pick one frame every few seconds, and sample images with a much larger variance.\"}),/*#__PURE__*/e(\"p\",{children:\"By default, Darwin will assume you want to label a video. But you can also choose to label individual frames. These will be treated like images, and the video will become something like a folder instead. But for this tutorial, we'll focus on video. 
When uploaded, your video will take a few minutes to process.\"}),/*#__PURE__*/e(\"p\",{children:\"Darwin will make sure it can be annotated at the highest possible resolution, that there is no video compression affecting the image quality, and that no frames are dropped or miscounted. This all happens automatically in the background, without you having to worry about decoding or preprocessing any of your image data.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, we're now at the annotation interface. In the top right, you'll see the annotation list that you're familiar with. At the bottom, you'll also see a stacked timeline that shows the annotations and when they occur. Here on the left are navigation controls. You can press the arrow keys to move in between frames or click on these buttons.\"}),/*#__PURE__*/e(\"p\",{children:\"Video frames are intelligently preloaded and preprocessed, so you'll never have to wait for buffering. You can also click and drag the playhead to scrub through the video and play it at whatever speed you wish by moving back and forth. And whenever you pause, Darwin will load a full resolution version of that frame.\"}),/*#__PURE__*/e(\"p\",{children:\"You can click on keyframes to jump to them or anywhere on the timeline to skip to a specific location. We'll get to what keyframes are in a moment. Darwin's video annotation works much like video editing software. Objects appear as events in a moving interactive timeline at the bottom, and you can manipulate them by modifying the annotations, interpolating any changes, or moving them through time.\"}),/*#__PURE__*/e(\"p\",{children:\"Every type of annotation within V7 Darwin interpolates, including polygons. But you can choose to switch interpolation on and off at any point. Whenever you apply a change to a label, you automatically generate a keyframe at that point in time. 
A keyframe indicates that there has been a change to a label, and you can interpolate in between them.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, enough theory, let's get down to making our first label. We'll see the front of this car appear on the edge of the screen, and make a bounding box around it. We can press the right arrow key to advance the frames a little bit, and adjust it to its new position. In this case, we're labeling frame by frame, as this vehicle is changing a lot between them.\"}),/*#__PURE__*/e(\"p\",{children:\"However, once it's in our shot, we can start interpolating in between frames and simply drag our bounding box across the screen. We'll play the video and arbitrarily stop it at a point at which we want to move our box. As soon as we pick it up and move it, we will automatically generate a keyframe at this point that indicates that the box is supposed to be here at this time.\"}),/*#__PURE__*/e(\"p\",{children:\"We can then backtrace a little bit and make sure that the box sticks correctly to the car. Okay, it looks like we need to extend the duration of this label. You can select the label and click and drag its extremities to extend its duration. We'll make sure it lasts for the entire time the car is on screen.\"}),/*#__PURE__*/e(\"p\",{children:\"You can also obviously reduce the duration of a clip or move it around in time. If you do so, the position of frames will be saved, so they continue to represent the point in time at which they were originally created. You can see where these keyframes are by the black diamonds on the annotations. If you need to delete one or create one, you can access the context menu by right-clicking on any annotation.\"}),/*#__PURE__*/e(\"p\",{children:\"Sub-annotation changes also generate keyframes. These are quite useful for attributes. For example, this car may be a sports car for the whole duration of the clip, but it may be turning only for a few frames. 
You can generate keyframes that hold sub-annotation information like attributes, instance IDs, directional vectors, and so on.\"}),/*#__PURE__*/e(\"p\",{children:\"You can change them throughout the video. Let's complete our car chase over here so I can show you more examples. On the bottom right, you have a keyframe and interpolation control. If this diamond is red, your playhead is on a keyframe. You can click it to delete that keyframe as well. You can also use it to generate empty keyframes.\"}),/*#__PURE__*/e(\"p\",{children:\"These are useful to indicate that an annotation should not be moving at all; an empty keyframe essentially holds no change information. As with regular image annotation, you can also hide other annotations or sub-annotation information to make sure you've labeled your object correctly. You can zoom in to any point of the video and drag it around while it plays, and watch hundreds of annotations move in real time, even on 4K videos like this one.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, one more feature set. Let's look at polygon interpolation. You can switch on interpolation at any point for polygons and move their respective points to match a new shape. Darwin will not just interpolate polygons with the same number of points, but even ones that have different numbers of points between the start and end of a video.\"}),/*#__PURE__*/e(\"p\",{children:\"This requires some pretty complex engineering work to run smoothly and in real time, so we're pretty proud of this achievement. We've also made sure it works with polygons with hundreds of vertices. And finally, let's see how AI-assisted labeling can work in video. Here we're using Darwin's Auto-Annotate functionality to generate these pixel-perfect masks of this person walking down a hallway.\"}),/*#__PURE__*/e(\"p\",{children:\"Rather than having to redraw this detailed silhouette every frame, all we have to do is press the right arrow key and rerun the neural network to re-segment him. 
And as you can see, it takes less than 2 seconds per frame to generate a pixel-perfect instance segmentation with no prior training data at all.\"}),/*#__PURE__*/e(\"p\",{children:\"I'll speed up the rest of the sequence. Correction markers are also carried over to new frames that are auto-annotated so that knowledge of the object is preserved. And all of this will allow you to label entire videos in minutes rather than hours. The sequence of this man walking, for example, took about two minutes to complete, start to finish.\"}),/*#__PURE__*/t(\"p\",{children:[\"Take a look at how detailed its segmentation looks, and how accurate the minute changes are in between frames. Our team at V7 is very proud of our video annotation features, and we hope you'll enjoy them too. You can try all of this now at \",/*#__PURE__*/e(o,{href:\"https://www.v7labs.com/\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"v7labs.com\"})}),\".\"]})]});export const richText7=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"In this video session, we explore the Webhook stage in V7 workflows, a powerful feature that enables real-time communication between V7 and other applications. This guide will walk you through the concept of webhooks, their functionality, and how to integrate them into your workflow. You'll discover how to automate tasks using user-friendly solutions like Zapier and advanced developer solutions like AWS Object Lambda.\"}),/*#__PURE__*/e(\"p\",{children:\"Webhooks are your go-to solution for a wide range of needs, whether it's sending notifications, processing Darwin JSON files in external applications, or filtering datasets. 
They let you trigger specific actions, such as sending Slack notifications based on custom tags or other conditions, giving you complete control over your machine learning processes.\"}),/*#__PURE__*/e(\"p\",{children:\"To assist you in getting started, we provide a detailed, step-by-step walkthrough on creating a new workflow, adding the webhook stage, and seamlessly connecting it with other stages. Additionally, the video demonstrates how to test the webhook stage to ensure flawless functionality. You'll also find practical examples of leveraging webhooks with external tools for tasks like counting annotations. Moreover, since you can use Darwin JSON as a webhook payload, in some cases, you can even run inference on external applications and use them similarly to the BYOM feature.\"}),/*#__PURE__*/e(\"p\",{children:\"By mastering the art of setting up and utilizing webhooks, you'll be able to create complex workflows tailored to your specific needs, all without the need for coding.\"}),/*#__PURE__*/t(\"p\",{children:[\"Webhook documentation: \",/*#__PURE__*/e(o,{href:\"https://docs.v7labs.com/reference/webhooks\",nodeId:\"Y122M6z_Y\",openInNewTab:!1,smoothScroll:!1,children:/*#__PURE__*/e(\"a\",{children:\"https://docs.v7labs.com/reference/webhooks\"})})]})]});export const richText8=/*#__PURE__*/t(a.Fragment,{children:[/*#__PURE__*/e(\"p\",{children:\"With V7's workflow logic, you can already design simple or very sophisticated automation for your annotation pipeline. However, if you want to go one step further and include custom automation using no-code solutions like Zapier and developer solutions like AWS Object Lambda, you can include one specific stage into your workflow - the Webhook stage. But first, what are Webhooks?\"}),/*#__PURE__*/e(\"p\",{children:\"Webhooks are a way for applications to communicate with each other in real-time. 
One way to communicate is that App 2 asks App 1 for some data, and App 1 sends the data if it happens to have what App 2 wants.\"}),/*#__PURE__*/e(\"p\",{children:\"Webhooks, on the other hand, are automated messages sent from apps when something happens.\"}),/*#__PURE__*/e(\"p\",{children:\"Those messages or \u201Cpayloads\u201D can be anything that the receiving end (or app) is expecting and knows how to work with. In this case, App 1 simply needs to know where to send the message, or in other words - it needs to know a unique URL. Once the payload reaches the receiving app, it can do whatever it wants with it, like sending you a notification or doing some computation with the information provided.\"}),/*#__PURE__*/e(\"p\",{children:\"In the case of the V7 Webhook stage, every time an image or item reaches that stage, it sends out a message to the specified URL where the payload is then processed. When adding a Webhook stage, you will of course need to provide this URL - and if your app needs some sort of authorization, you can add that right here and it will be included in the header of the message.\"}),/*#__PURE__*/e(\"p\",{children:\"The most important decision you need to make at this point is whether or not to include the annotation data. If you don't, the payload will simply include the metadata of the image. If you do include the annotation data, the message that is sent out will look similar to this one right here. It includes data following the standard Darwin JSON annotation format.\"}),/*#__PURE__*/e(\"p\",{children:\"For a detailed overview of this format, feel free to have a look at the full video that describes it or at the documentation. Links are in the description.\"}),/*#__PURE__*/e(\"p\",{children:\"Now, with all this information, you can very easily access the data you need for your custom logic. 
The payload includes metadata of the item and, as mentioned, if selected, the annotation data, including information on the annotators, tags, and more.\"}),/*#__PURE__*/e(\"p\",{children:\"For instance, here we have a line annotation, with the label Traffic Boundary Line, created by Kevin Chang. We then also have a bounding box for a vehicle. Using the Webhook stage that sends out this information, you can, for example, easily use Zapier to search for certain tags or write custom functions in AWS Object Lambda to do more complex computations.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's have one more detailed look at how the data would flow through a workflow. Let's start by creating a new workflow. Now, with this simple workflow, let's go ahead and add our Webhook stage. To connect the Webhook stage to our workflow, let's go ahead and connect it first to our annotation stage, and then to our review stage.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's also go ahead and connect the failed output to the review stage because, in the end, it's still better to forward your item to the next stage, even after a failed attempt at sending yourself a Slack notification using Zapier, for example, than to move it back one stage.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, let's go ahead and finish our Webhook setup. We need to provide our URL, as already discussed, and in this case, I'll be using the Webhook site for this little demo. In your case, if you need authorization, you can add your key for that here.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, so let's have a brief look at how the data would flow through this workflow. In the beginning, we have our data in this dataset stage, if we had connected a dataset.\"}),/*#__PURE__*/e(\"p\",{children:\"If we then proceed to the next stage, our data or images, for example, would go into the annotation stage. Here we can add labels and tags, and once we proceed to the next stage, our images will come to the Webhook stage. 
Here, our images will automatically be sent to the URL that we have provided. On the receiving end, our app would execute our custom logic and we would be happy.\"}),/*#__PURE__*/e(\"p\",{children:\"After our app has successfully, or in this case even unsuccessfully, processed our message, the item will be forwarded to the review stage.\"}),/*#__PURE__*/e(\"p\",{children:\"Now, to see if our Webhook stage works correctly with the URL that we've provided, you can use this test button right here. If we click it, we can see that a test for the Webhook stage has been executed and has passed successfully.\"}),/*#__PURE__*/e(\"p\",{children:\"Okay, continuing, we would now be in the review stage. Here, we have some reviewers who can accept all annotations, which will then move the items to the complete stage, or they can reject the annotations, which will then forward the items back to the annotation stage.\"}),/*#__PURE__*/e(\"p\",{children:\"One would reject some annotations if they are poorly done, or if some annotations are missing.\"}),/*#__PURE__*/e(\"p\",{children:\"Once the items are back in the annotation stage, annotators can redo the annotations or add missing annotations, and forward them back to the Webhook stage, allowing the loop to continue until all items have passed successfully and are accepted.\"}),/*#__PURE__*/e(\"p\",{children:\"With webhooks, you can implement any logic that you need for your specific case.\"}),/*#__PURE__*/e(\"p\",{children:\"For example, let's say you want to compute the area of an annotation or just count the number of pixels. 
You can then write a custom function in AWS Object Lambda that adds a tag to the image (one that requires review by specific people, for example), or you can just send a Slack message using Zapier.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's say that during the annotation stage, an annotator has added a tag stating \u201Curgent\u201D or \u201Cdifficult\u201D. This is then forwarded to the Webhook stage and, using Zapier, you can filter through the tags, and if those tags exist on an image, you again get a Slack notification.\"}),/*#__PURE__*/e(\"p\",{children:\"One more example could be that you want to count the number of annotations or labels.\"}),/*#__PURE__*/e(\"p\",{children:\"Let's say you have a specific product that has at least 10 screws, but there are only 8 annotations. You can then have a custom function in AWS Object Lambda that adds a tag \u201Cmissing annotations\u201D.\"}),/*#__PURE__*/e(\"p\",{children:\"You can see that webhooks are a versatile, yet simple and easy-to-integrate concept that can be very powerful.\"}),/*#__PURE__*/e(\"p\",{children:\"That was it. 
I hope this video helped you get started with V7.\"})]});\nexport const __FramerMetadata__ = {\"exports\":{\"richText5\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText7\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText8\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText2\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText6\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText3\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText1\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"richText4\":{\"type\":\"variable\",\"annotations\":{\"framerContractVersion\":\"1\"}},\"__FramerMetadata__\":{\"type\":\"variable\"}}}"],
  "mappings": "2JAA+G,IAAMA,EAAsBC,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,4WAA4W,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0TAA0T,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2UAA2U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4YAA4Y,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4VAA4V,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,yaAAya,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qYAAqY,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,gWAAgW,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oTAAoT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qWAAqW,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,8TAA8T,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,6UAA6U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+UAA+U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,8WAA8W,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+XAA+X,CAAC,EAAeF,EAAE,IAAI,CAAC,SAAS,CAAC,mJAAgKE,EAAEC,EAAE,CAAC,KAAK,0BAA0B,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,YAAY,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,EAAeE,EAAuBJ,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,ocAA+b,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+XAA+X,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,8PAA8P,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uSAAuS,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+TAA+T,CAAC,CAAC,CAAC,CAAC,EAAeG,EAAuBL,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,wFAAwF,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mFAAmF,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sMAAsM,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oYAAoY,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,gMAAgM,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wSAAwS,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,yQAAyQ,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,gHAAgH,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qVAAqV,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0SAA0S,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+DAA+D,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0BAA0B,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,gSAAgS,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2UAA2U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,6SAA6S,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uOAAmN,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4TAAwS,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,8TAA8T,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2UAAuT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wPAAwP,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qSAAqS,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qOAAqO,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2LAAsL,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,kQAAoO,CAAC,EAAeA,EAAE
,IAAI,CAAC,SAAS,+DAA+D,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oRAA+Q,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qKAAqK,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,obAA0a,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2VAA2V,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sPAAsP,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,kKAAkK,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4RAA4R,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mPAAmP,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+DAA+D,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4XAAkX,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oPAA0O,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0EAA0E,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wLAAmL,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uLAAkL,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0JAA0J,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2JAA2J,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2KAA2K,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,gJAAgJ,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4DAA4D,CAAC,CAAC,CAAC,CAAC,EAAeI,EAAuBN,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,2RAAsR,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mSAAmS,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,6JAA6J,CAAC,EAAeF,EAAE,IAAI,CAAC,SAAS,CAAC,2FAAwGE,EAAEC,EAAE,CAAC,KAAK,4CAA4C,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,cAAc,CAAC,CAAC,CAAC,EAAE,kGAAkG,CAAC,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oLAA+K,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4NAA4N,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0TAAqT,CAAC,EAAeF,EAAE,IAAI,CAAC,SAAS,CAAC,yDAAsEE,EAAEC,EAAE,CAAC,KAAK,+CAA+C,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,yBAAyB,CAAC,CAAC,CAAC,EAAE,2EAA2E,CAAC,CAAC,CAAC,CAAC,CAAC,EAAeK,EAAuBP,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,oUAAoU,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uTAAuT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+SAA+S,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+WAA+W,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mUAAmU,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0VAA0V,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2NAA2N,CAAC,CAAC,CAAC,CAAC,EAAeM,EAAuBR,EAAIC,EAAS,CAAC,SAAS,CAAcD,EAAE,IAAI,CAAC,SAAS,CAAC,kDAA+DE,EAAEC,EAAE,CAAC,KAAK,0CAA0C,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,kBAAkB,CAAC,CAAC,CAAC,EAAE,+MAAqM,CAAC,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mLAA8K,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,odAAod,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,yWAAy
W,CAAC,EAAeF,EAAE,IAAI,CAAC,SAAS,CAAC,gSAAwSE,EAAEC,EAAE,CAAC,KAAK,qDAAqD,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,6BAA6B,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,EAAeO,EAAuBT,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,yTAAyT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sXAAsX,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wTAAwT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mUAAmU,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wVAAwV,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,gUAAgU,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,kZAAkZ,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2VAA2V,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0WAA0W,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,2XAA2X,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qTAAqT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0ZAA0Z,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,kVAAkV,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,8UAA8U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,ocAAoc,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,6UAA6U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+YAA+Y,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,kTAAkT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,8VAA8V,CAAC,EAAeF,EAAE,IAAI,CAAC,SAAS,CAAC,iPAA8PE,EAAEC,EAAE,CAAC,KAAK,0BAA0B,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,YAAY,CAAC,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,EAAeQ,EAAuBV,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,uaAAua,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sWAAsW,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+jBAA+jB,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,yKAAyK,CAAC,EAAeF,EAAE,IAAI,CAAC,SAAS,CAAC,0BAAuCE,EAAEC,EAAE,CAAC,KAAK,6CAA6C,OAAO,YAAY,aAAa,GAAG,aAAa,GAAG,SAAsBD,EAAE,IAAI,CAAC,SAAS,4CAA4C,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,EAAeS,EAAuBX,EAAIC,EAAS,CAAC,SAAS,CAAcC,EAAE,IAAI,CAAC,SAAS,+XAA+X,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oNAAoN,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4FAA4F,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,kaAAwZ,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sXAAsX,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sWAAsW,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,6JAA6J,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,6PAA6P,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sWAAsW,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,+UAA+U,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,oRAAoR,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,0PAA0P,CAAC,EAAeA,EAAE,IAAI,CAAC,SA
AS,6KAA6K,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mYAAmY,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uIAAuI,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mTAAmT,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wRAAwR,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,sGAAsG,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uPAAuP,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,wFAAwF,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,iUAAiU,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,mSAA+Q,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,uFAAuF,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,4NAAkN,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,qHAAqH,CAAC,EAAeA,EAAE,IAAI,CAAC,SAAS,yEAAyE,CAAC,CAAC,CAAC,CAAC,EACtv0CU,EAAqB,CAAC,QAAU,CAAC,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,SAAW,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,UAAY,CAAC,KAAO,WAAW,YAAc,CAAC,sBAAwB,GAAG,CAAC,EAAE,mBAAqB,CAAC,KAAO,UAAU,CAAC,CAAC",
  "names": ["richText", "u", "x", "p", "Link", "richText1", "richText2", "richText3", "richText4", "richText5", "richText6", "richText7", "richText8", "__FramerMetadata__"]
}
