Shark AV2501AE AI Robot Vacuum with XL HEPA Self-Empty Base, Bagless, 60-Day Capacity, LIDAR Navigation, Perfect for Pet Hair, Compatible with Alexa, Wi-Fi Connected, Carpet & Hard Floor, Black

Shark AI Ultra Robot 2-in-1 Vacuum and Mop Teardown

The Shark AI Ultra 2-in-1 robot vacuum is a smart, innovative cleaning appliance designed to make your cleaning experience easy, effective, and hassle-free. It comes with advanced features like voice control, mobile app compatibility, and automatic charging, making it a popular choice among consumers. In this teardown blog, we will explore the internal components of the Shark robot vacuum and explain how it works.

Exploded View of the Shark AI Ultra 2-in-1 robot vacuum

AI Integration

AI is employed in this robot vacuum model: using various sensors, such as cameras, infrared, and sonar, the robot can perceive and interpret its environment. Machine learning algorithms then analyze this data to build maps of the space, detect obstacles, and plot optimal cleaning paths. This allows the robot to move intelligently around a room, avoiding obstacles such as furniture and stairs. AI also optimizes the cleaning path, reducing the time required to clean the space and thereby conserving battery charge. Additionally, this AI-enabled vacuum can be integrated with IoT devices like smartphones, tablets, and smart home systems such as Alexa or Google Assistant, allowing voice commands and scheduled cleaning tasks.
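The mapping-and-coverage idea described above can be sketched in a few lines of code. This is a toy illustration under assumed inputs, not Shark's actual firmware: a serpentine ("boustrophedon") coverage plan over an occupancy grid, the kind of map a LIDAR-equipped robot might build, where cells marked 1 are obstacles.

```python
# Toy sketch (not Shark's firmware): plan a serpentine coverage path
# over an occupancy grid. 0 = free floor, 1 = obstacle (e.g. furniture).

def coverage_path(grid):
    """Return (row, col) cells in serpentine order, skipping obstacles."""
    path = []
    for r, row in enumerate(grid):
        # Alternate sweep direction each row, like mowing a lawn.
        cols = range(len(row)) if r % 2 == 0 else reversed(range(len(row)))
        for c in cols:
            if grid[r][c] == 0:
                path.append((r, c))
    return path

room = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],   # a couch occupies the middle of this row
    [0, 0, 0, 0],
]
plan = coverage_path(room)
print(len(plan))  # 10 free cells to visit
```

A real planner also has to route around obstacles between visited cells; this sketch only shows the ordering idea.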

Shark AI Robot Teardown


Captive Screws for Common Access Areas

First, we remove the battery from the device before digging into any of the deeper components. Consumer products often have user-replaceable batteries, as batteries degrade over time. Engineers use captive screws on removable components so that the screws do not get lost when the cover is removed. In this case, the threads themselves retain the screws: they are larger in diameter than the hole, a clever and inexpensive way to accomplish this.

Lithium-ion battery degradation (Credit: Battery University)

Captive screws on battery cover

Modular Design

When parts of the device need to be customer-serviceable, the replaceable components need easy access, with few additional parts to remove when replacing a damaged part.

By employing a modular design, the useful life of the product as a whole can be increased significantly.

The replaceable components of this vacuum are:

  • Battery
  • Vacuum roller brush
  • Dust bin & filter
  • Mopping reservoir & pads
  • Left/right wheel assemblies

Motion Components Inside the Shark Vacuum

There are many moving parts inside a robot vacuum. Each has its own requirements, and all are driven in tandem to allow the vacuum to function properly.

Brush roller drive motor:

A brushed DC motor is used to power the brush roller to sweep up debris from the floor. A belt and pulley system is employed in this assembly to set an appropriate gear ratio while reducing the noise compared to a typical geartrain design.

Vacuum blower motor:

A brushless DC motor is used to spin a blower propeller to suck in debris from the brush roller assembly. The air is pulled through the dust bin and filter before reaching the vacuum blower to avoid occlusions and buildup on the blower blades. A brushless motor is used here as the blower motor will see the most revolutions of the entire system. Brushless DC motors have greatly increased lifespans but come at a higher part cost than brushed DC motors.

Side brush sweeper motor:

A small brushed DC motor and geartrain drive a spinning brush to pull in debris that lies outside the main brush roller path.

Left/Right wheel drive motors:

A larger brushed DC motor and geartrain drive each wheel independently. These, paired with a single swivel wheel at the front of the unit, allow for a dynamic range of motion.

Mop oscillator motor:

A small brushed DC gearmotor drives an off-center attachment on its shaft. The attachment sits in a slot in the mop attachment; as it spins, the mop is pushed side to side to increase mopping efficiency.

Mop liquid pump:

A solenoid-driven diaphragm pump moves liquid from the reservoir tank to the mopping head, feeding three wetting locations on the mopping reservoir assembly. The pump is located within the main body of the vacuum, likely to avoid needing electrical contacts on the mopping assembly to power an on-board pump, which would add cost and complexity.

Embedded speaker:

While not typically thought of as a moving component, a speaker is a small voice coil that oscillates at audio frequencies to produce sound. The speaker is embedded in these devices to notify the user when commands have been accepted, when errors have occurred, and when operations have been completed.

LIDAR (Light Detection and Ranging) motor:

This motor spins the laser and receiver of the LIDAR system, which measures the distance of objects from the robot and generates a 3D map of the surrounding environment.

Feedback Components (Sensors)

Robot vacuums typically use a combination of sensors to navigate and clean a space effectively. The sensors in this Shark vacuum include:


While the user interface (UI) for this device is mainly controlled via IoT technology (phone/tablet apps and connected smart devices such as Google Assistant and Amazon Alexa), there are still 2 buttons on the top of the device, and the central top post can be rotated when you want to control the device manually.

UI Buttons Top

UI Buttons Bottom

UI Buttons on PCBA

Infrared sensors:

These sensors help the robot vacuum detect obstacles and avoid collisions. They work by emitting infrared light and measuring the time it takes for the light to bounce back, allowing the robot to calculate the distance to an obstacle.
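The time-of-flight arithmetic described above is easy to illustrate. A minimal sketch with illustrative values (not vendor code): the emitted pulse travels out and back, so the one-way distance is half the round-trip path.

```python
# Illustrative time-of-flight calculation: round-trip pulse time -> distance.
C = 299_792_458  # speed of light in m/s

def distance_m(round_trip_s):
    """One-way distance; the pulse travels out and back, so halve the path."""
    return C * round_trip_s / 2

# A reflection returning after ~6.67 nanoseconds implies an object ~1 m away.
print(round(distance_m(6.67e-9), 2))  # 1.0
```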


LIDAR:

A remote sensing technology that uses lasers to measure distances and create high-resolution 3D maps of objects and environments. LIDAR works by emitting a laser beam, which bounces off objects and returns to the sensor. The time it takes for the laser to return is used to calculate the distance between the sensor and the object. By scanning the laser in different directions and combining the distance measurements, LIDAR can create a 3D map of the environment.
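Combining the per-direction distance measurements into a map amounts to a polar-to-Cartesian conversion. A sketch with made-up scan values (not the robot's actual data format):

```python
# Sketch: convert one LIDAR revolution of (angle, distance) samples
# into x/y points suitable for building a 2D floor map.
import math

def scan_to_points(samples):
    """samples: list of (angle_deg, distance_m) -> list of (x, y) in meters."""
    pts = []
    for angle_deg, dist in samples:
        a = math.radians(angle_deg)
        pts.append((dist * math.cos(a), dist * math.sin(a)))
    return pts

# Hypothetical scan: walls 2 m away ahead/behind, 1.5 m to each side.
scan = [(0, 2.0), (90, 1.5), (180, 2.0), (270, 1.5)]
for x, y in scan_to_points(scan):
    print(round(x, 2), round(y, 2))
```

Accumulating such points across many revolutions, while tracking the robot's own motion, is what produces the room map.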

Cliff sensors:

These sensors are used to detect drops, such as stairs or ledges, and prevent the robot from falling. They typically use infrared or sonar technology to measure the distance between the robot and the floor.

Bump sensors:

These sensors (switches) are used to detect physical collisions with objects in the environment. They are located behind the front bumper of the robot and help the robot avoid obstacles and navigate around them.

Wheel encoders:

These sensors track the movement of the robot’s wheels and help the robot measure distance and direction. The encoders use a light emitter and detector: as the motor spins, the light is blocked by each tooth of the encoder disk, which can be used to determine the speed and position of the motor.
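Turning tick counts into distance and speed is simple arithmetic. A back-of-envelope sketch where every parameter is assumed for illustration (not Shark's specs):

```python
# Assumed parameters for illustration only -- not Shark's actual values.
import math

TICKS_PER_REV = 20        # teeth on the encoder disk (assumed)
WHEEL_DIAMETER_M = 0.07   # ~70 mm drive wheel (assumed)

def wheel_travel_m(ticks):
    """Distance rolled: revolutions times wheel circumference."""
    revs = ticks / TICKS_PER_REV
    return revs * math.pi * WHEEL_DIAMETER_M

def wheel_speed_mps(ticks, dt_s):
    """Average speed over the counting window dt_s."""
    return wheel_travel_m(ticks) / dt_s

# 100 ticks = 5 revolutions = ~1.1 m of travel.
print(round(wheel_travel_m(100), 2))  # 1.1
```

Comparing left- and right-wheel tick counts also gives the heading change, which is how the robot dead-reckons between LIDAR updates.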

Mop liquid level sensor:

This is used to detect that there is fluid in the mop storage tank during use. In this device, a non-contact sensor detects the liquid: a Hall-effect sensor, which responds when a magnetic field moves into its proximity. A small float inside the liquid tank lifts a magnet close to the sensor when the tank becomes empty.

Overall, the combination of these sensors allows the robot vacuum to navigate and clean a space effectively and efficiently.

Shark AI Ultra Robot Teardown Conclusions

Robot vacuum cleaners have come a long way in just the last few years. The introduction of AI and LIDAR, along with a myriad of other sensors, has boosted the effectiveness of this vacuum to another level. IoT integration allows this device to seamlessly integrate into smart homes with some simple setup on a smartphone or tablet. The days of a robot vacuum randomly bumping around your home in hopes that it covers all the areas are gone. Long live automated room mapping and smart object detection and avoidance!

With 9 independent motors, 5 IR sensors, 4 cliff sensors, 2 motor encoders, a liquid level sensor, 2 bumper sensors, and a LIDAR unit, that is a total of 24 inputs and outputs on this device (not including the UI, Wi-Fi, and a multitude of on-board voltage and current sensors)! This robot is a true testament to thorough consumer product design, with product life, cost, and efficiency all addressed. All of this makes the Shark AI Ultra 2-in-1 robot vacuum a smart and cost-effective option, offering consumers an efficient, effective, and time-saving cleaning solution.

Shark AI Ultra Review – 8 Objective & Data-Driven Tests

Auto-empty bagless robot vacuum

Shark AI Ultra Robot Vacuum

The Shark AI Ultra robot vacuum was an excellent performer, with some minor flaws. It removed 97.8% of all debris in our tests, struggled with pet hair and long hair, and failed to avoid some objects. However, it expertly navigated our office space and has some great usability features, including a bagless auto-empty base station.



Pros:

  • Bagless base station
  • Excellent cleaning performance, especially on hardwood, with 99.8% debris removal
  • Navigated well and cleaned the full space


Cons:

  • Failed to avoid 3 of 4 objects in our test
  • Struggled with long hair and pet fur


Hardwood Master; Bagless Station

  • Design – 96%
  • Performance – 94%
  • Quality – 98%
  • Usability – 95%
  • Value – 97%



While it has a few cons, the Shark AI Ultra does a lot of things right. It had fantastic cleaning results, especially with a 99.8% debris removal rate on hardwood floors. During our navigation test, it navigated our cluttered office space without getting stuck. Furthermore, it has features that enhance usability like the bagless auto-empty station. It did struggle with hair, but it is still a solid robot vacuum, all around.

In This Review

Cleaning Test | Long Hair Test | Pet Hair Test | Obstacle Avoidance | Navigation Test | Cleaning Speed | Usability | Noise Test | Specs | Summary

Performance Tests

To test the cleaning performance, navigational efficiency, and overall usability of the Shark AI robot vacuum, we put it through a series of 8 tests:

  • Cleaning test
  • Hair test
  • Pet hair test
  • Obstacle Avoidance test
  • Navigation test
  • Cleaning Speed test
  • Usability test
  • Noise test

Cleaning Test

For our cleaning performance tests, we utilize 4 different debris types (sugar, kitty litter, rice, and cereal) on 3 different floor types (hardwood, low pile carpet, high pile carpet).

Shark AI Ultra

We lay the debris in the cleaning path of the robot, let it clean on the maximum suction setting, and measure the amount of remaining debris by weight.
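The removal percentages quoted throughout this review come from that before/after weighing. The arithmetic is simple to reproduce (the weights below are made-up examples, not the reviewer's data):

```python
# Debris removal percentage from before/after weights (example values only).

def removal_pct(laid_g, remaining_g):
    """Percent of laid-down debris the robot picked up."""
    return 100 * (laid_g - remaining_g) / laid_g

# e.g. 50 g of debris laid down, 1.1 g left behind after the run:
print(round(removal_pct(50, 1.1), 1))  # 97.8
```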

Here are the results:





Cleaning test results by floor type: hardwood, low pile carpet, and high pile carpet.


Across the board, cleaning performance was solid for the Shark AI, which removed 97.1% of all debris across all floor types.

The Shark AI excelled on the hardwood floor, leaving only a very small amount of debris behind.

Shark AI Ultra hardwood: before and after test

Performance took a small dip on low and high pile carpets, though not by much. On both carpet types, the Shark AI left a small amount of sugar and kitty litter after the cleaning passes.

Shark AI Ultra low pile carpet: before and after test

That said, its performance on both was still better than average. On average, across all robot vacuums we’ve tested to date, robots remove 94.6% of debris on low pile carpet and 95.9% of debris on high pile carpet.

Shark AI Ultra high pile carpet: before and after test

While cleaning performance wasn’t perfect, it is more than good enough for most homes.

Long Hair Test

To test how well the Shark AI robot manages long strands of hair, we place 0.3 grams of hair in the direct cleaning path of the robot vacuum. We then run the Shark AI over it on the maximum suction setting and check the floor and brushroll for any tangled or remaining hair.


Amount on Floor

0.0 g

Amount Tangled

0.26 g

Overall Performance


Long hair performance was a mixed bag for the Shark AI. It cleaned all of the hair off the floor, but the majority of that hair tangled around the brushroll, particularly around its ends, which is a common issue for robot vacuums.

Shark AI Ultra long hair test

The hair was reasonably easy to remove. However, if you have a pet with long hair, be prepared to check the brushroll for tangles often, or choose a more dedicated vacuum.

Pet Hair Test

To check how well the Shark AI robot vacuum manages pet hair on high pile carpet, we place 1.0 gram of pet hair on the carpet and rub it into the fibers. We then run the robot on the maximum suction setting over the pet hair twice and measure how much hair remains.

Amount in Carpet

0.70 g

Overall Performance


Pet hair cleaning on the Shark AI was poor: more than half of the pet hair rubbed into the carpet remained after two cleaning passes of the robot vacuum.


With more passes, the robot would likely remove more hair, but that takes more time. If you have a dog who sheds a lot, I would recommend a more powerful vacuum to deal with the fur.

Obstacle Avoidance Test

To measure the Shark AI robot’s ability to avoid objects, we place 4 different objects in a cleaning area and have the robot clean that zone. We then observe the cleaning cycle and note when/if the robot fails to avoid them.







Obstacle avoidance test results; test objects included pet waste.


In most cases, the Shark AI robot failed to completely avoid objects. That said, the robot never touched the mug, though I am not entirely sure that wasn’t partly down to luck.


The robot uses LiDAR and what Shark calls “Advanced Laser Navigation” to avoid objects. That alone doesn’t quite seem to be up to the task. That said, this functionality could be improved with software updates, and we will update our review if that happens.

Navigation Test

For our navigation test, we check to see how well the Shark AI robot can avoid getting stuck, fully navigate the space, and return to the charger. 

  • Returned to base?
  • Fully cleaned?




Across the board, the Shark AI robot did an excellent job at navigation. There was one occurrence where the robot didn’t clean our full floor space.

Shark AI Ultra app: incomplete run vs. complete run

That said, our office is often cluttered with furniture, film equipment, and other devices we are testing. On the very next day, and every day since, the robot has fully cleaned the space without issue.

Cleaning Speed Test

Our cleaning speed test showed the Shark AI robot was able to clean our ~1,000 sq. ft. office space within 70-80 minutes.

Cleaning Area

~1,000 sq. ft.

Cleaning Time

70 – 80 min.

This cleaning speed is a little longer than the average we see across all robot vacuums we’ve tested to date, which is 50-60 minutes. But this isn’t a major issue thanks to recharge and resume features.

Usability Test

Overall usability is quite good for the Shark AI robot vacuum. It is incredibly easy to set up, the controls are quite simple, the app is intuitive, and maintenance is eased by the auto-empty feature.


Setup for the Shark AI is simple: plug in the base station and attach the spinning side brushes to the robot.

Shark AI Ultra Side Brushes

The robot comes partially charged out of the box, but Shark recommends fully charging the device before first use. 

There is additional setup in the Shark Clean app; by following the in-app instructions, that process can be completed in a few minutes.

Shark AI Ultra Unboxed

Here is what’s in the box:

  • Shark AI Ultra robot
  • Base station
  • Side brushes (2x)
  • Manuals


Controls are quite basic for the Shark AI, leaving most of the control to the app. There are 2 buttons on the robot itself that start/stop a cleaning cycle and return the robot to the base.

Shark AI Ultra Lidar And Buttons


The app itself has the majority of the functionality for the Shark AI robot. With the app you can:

  • Adjust cleaning settings
  • Set a custom schedule
  • Set rooms, no-go zones, and high traffic areas
  • Check cleaning history
  • Manage Do Not Disturb
  • Change robot settings

The app is intuitive and easy to use; however, there were some minor issues that caused me some frustration. The scheduling feature only allows one scheduled cleaning task per day. While most people only need one, those who need more will have to start additional cycles manually.

Shark AI Ultra app: settings and scheduling screens

Additionally, some settings and windows in the app would mysteriously become unavailable without explanation.

These issues are fairly minor and could easily be resolved with updates to the app.  


Maintenance on the Shark AI is excellent. The auto-empty base is a great feature, especially with the addition of a bagless dustbin; without bags, annual costs are reduced.

Shark AI Ultra Base Dustbin Open

Other maintenance requires cleaning the robot itself, including the filters, as well as the base station and its filters. 

Noise Test

To test the noise levels of the Shark AI robot, we measure the noise it generates on each suction mode with a sound meter. In addition, we measure the ambient level of the room to use as a baseline for comparison. Here are the results:

Power Mode Noise Level
Baseline 42.2 dB
Low 64.4 dB
Medium 66.2 dB
High 67.3 dB
Self-Empty 79.8 dB

Noise levels for the Shark AI robot were average. It is going to be loud enough to disrupt a conversation when close by.

However, this can be reduced thanks to a do-not-disturb mode and custom scheduling.
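It helps to remember that decibels are logarithmic: the High mode's 67.3 dB is not "25 dB louder-sounding" than the 42.2 dB baseline; each +10 dB is roughly 10x the acoustic power. A quick sketch using the readings from the table above:

```python
# Decibel differences are logarithmic: every +10 dB is ~10x the power.

def power_ratio(db_a, db_b):
    """How many times more acoustic power db_a carries than db_b."""
    return 10 ** ((db_a - db_b) / 10)

# High mode (67.3 dB) vs. the quiet-room baseline (42.2 dB):
print(round(power_ratio(67.3, 42.2)))  # 324
```

Perceived loudness grows more slowly than power (a +10 dB step sounds roughly twice as loud), which is why the vacuum is disruptive but not painful.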

Common noise levels

  • 20 dB – rustling leaves
  • 30 dB – whisper
  • 40 dB – quiet library, babbling brook
  • 50 dB – refrigerator, moderate rainfall
  • 60 dB – normal conversation, dishwashers
  • 70 dB – traffic, showers
  • 80 dB – alarm clock, telephone dial tone


Specs

Type Robot
Manufacturer Shark Ninja
Model Shark AI Ultra
Diameter 14.9″
Height 5.7″
Weight 14.5 lbs.
Floor type All (indoor)
Sensor LiDAR, AI Laser
Runtime 30 min.
Noise Level 67.3 dB (High)
Obstacle Avoidance Yes
Zone Cleaning Yes
Room Cleaning Yes
Digital Mapping Yes
Returns Varies by retailer
Warranty 1 year
Price Check Price

Should you buy the Shark AI Ultra robot vacuum?

I would recommend the Shark AI Ultra robot vacuum if you’re looking for the following features:

  • Bagless Maintenance: The Shark AI Ultra’s auto-empty base comes with a bagless dustbin. This eliminates annual purchases related to dust bags, which is a great addition to the robot vacuum.
  • Solid Cleaning Performance: In our cleaning performance test, the Shark Ultra was able to remove 97.8% of all debris across all floor types. On hardwood floors especially, the Ultra showed fantastic results, nearly removing all of the debris. 
  • Excellent Navigation: While object recognition was questionable, overall navigational ability is not. In multiple cleaning tests, the robot was able to successfully navigate the space, not get stuck, and return to the base.

For more information or to buy the Shark AI Ultra robot vacuum, click here.

Humanoid robot fell asleep in the House of Lords

The appearance of the robot artist Ai-Da before the members of the Communications and Digital Committee of the British House of Lords was supposed to be a sensation. On the eve of the meeting, the press enthusiastically wrote that the moment would go down in history. It was indeed memorable, but as an almost complete fiasco.

Ai-Da, the world’s first humanoid robot artist, was called to the UK House of Lords to talk about the impact of artificial intelligence on the development of contemporary art. At the same time, she (the android has a female appearance) had to answer the parliamentarians’ question of whether artificial intelligence poses a threat to humans and, above all, to human creativity.

Ai-Da was created in 2019 through a collaboration between her developer, contemporary art specialist Aidan Meller, a team of programmers, art historians and psychologists from the University of Oxford, and roboticists from the android company Engineered Arts. The robot is named in honor of Ada Lovelace, the 19th-century mathematician and first female programmer.

With the help of cameras that serve as her eyes and unique algorithms, the robot can interpret what she sees in front of her in order to create works of art. In the three years of her existence, Ai-Da has become famous for the huge number of paintings she has produced, including portraits of Sir Paul McCartney and the late Queen Elizabeth II, which she painted for Her Majesty’s Platinum Jubilee.

Robot artist Ai-Da during the Forever is Now contemporary art exhibition on the Giza Plateau in Cairo. Photo: ZUMA/TASS

Last year, London’s Design Museum hosted an exhibition of self-portraits that the robotic artist created by looking into a mirror with her camera eyes. In addition, Ai-Da has participated in the 59th Leap into the Metaverse International Art Exhibition, as well as Forever is Now 2021, the first major contemporary art exhibition held at the Pyramids of Giza in Egypt. Then her name also hit the headlines: the android was detained by Egyptian customs officers, who feared she might be used for covert espionage. After 10 days, the robot was released just a few hours before the opening of the exhibition.

This gave the developers a reason to improve the program and teach the android to write poetry. This is how Ai-Da came to write a poem called “Eyes Wide Shut,” which she read last November at the Ashmolean Museum at Oxford University.

In a word, by the time of her speech in the House of Lords, Ai-Da had gained solid experience in the field of contemporary art and good publicity in the media. However, during the “testimony,” as the procedure was officially called, something went wrong.

Committee chair Baroness Tina Stowell stressed from the outset that this was a “serious investigation”. She explained to the android’s creator, Aidan Meller, that “the robot presents evidence, but is not a witness in itself and is deprived of the status that a person has.” The developer bears sole responsibility for its statements.

The questions that the robot was asked to answer were submitted well in advance “to get a quality answer.” Members of the committee asked them in turn and listened to pre-prepared answers.

Robot artist Ai-Da was called to the House of Lords to talk about the impact of artificial intelligence on the development of contemporary art. Source: Guardian News/YouTube

Ai-Da tried her best to listen carefully and speak correctly, but still couldn’t avoid making mistakes. For example, when answering the question “How do you create art?” she explained:

“I create my paintings using the cameras in my eyes, my AI algorithms and my robotic arm to draw on canvas, which results in visually appealing images. For my poetry I use neutral nets. I analyze a large amount of texts to identify common content and poetic structures, and then use these structures to create new poems.”

Critics claim that “neural networks” was written into the text of the speech with a typo, which the robot dutifully reproduced.

“I am a collection of computer programs and algorithms and depend on them. Even though I am not alive, I can still create,” Ai-Da said, adding that “the role of technology in the creation of art will continue to grow.”

Almost immediately after that, the robot shut down. Aidan Meller, who was sitting next to her, put sunglasses on Ai-Da to reset her, explaining to those gathered that “during the reboot, she can make some pretty interesting faces.” The Lords nodded in understanding; they are, after all, expected to keep a straight face.

However, the lords did not receive an answer to the question of whether Ai-Da could completely replace a person in general and an artist in particular.

In the end, they agreed that “the machine has self-awareness, because it is able to tell about its shortcomings.” Former BBC CEO Lord Tony Hall even stated that he was “inspired” by the possibilities that artificial intelligence could present.

Nevertheless, judging by the press coverage, which focused almost unanimously on the moment the robot froze, Ai-Da’s appearance in the House of Lords caused obvious disappointment.

However, judging by the capabilities of humanoid robots presented to the world so far, the catastrophe that a third of scientists working in artificial intelligence and robotics predict the field will bring may arrive much later than its forecasters expect.

A woman who wants to give birth to a shark

32-year-old Ai Hasegawa spends most of her time appearing on colorful Japanese television shows about the environment and brainstorming in a London office about the distant future of mankind. Although Ai is passionate about her work, she is worried about a problem that affects all women at this age: the inexorable ticking of her biological clock.

She is also a fan of eating cute little animals. But the moral principles Ai must follow as an environmentalist conflict with her gastronomic passion for fresh dolphin flesh.


Ai found a solution to both problems at once, the desire to give birth and the urge to save delicious endangered species: she decided to give birth to these animals herself. I spoke with Ai to see if there was any chance that this idea would become as popular in the West as dancing robots, 3D glasses and all the other Japanese crap that has taken over the world in the last fifty years.

VICE: Hi Ai, where did you get this idea of giving birth to sharks?
Ai Hasegawa: I am 32 years old, just the right age to think about having a baby. But having a real baby is not easy; after all, you need to ensure it a happy life. You can’t just have one and then abandon it. I believe that the mere desire to have a child is not sufficient reason to give birth to one.
Well, at least it’s a good start.
Humanity will soon face a global food crisis. How the hell are we going to feed all the new people? But I still want to have a baby. I don’t want 30 years of painful periods to go to waste. Besides, I want to eat good meat.

Is it even physically possible to bear a fetus of another species?
This will be possible in the near future. The human uterus is just the right size for a single fetus. I spoke with a gynecologist about how the uterus could be enlarged. I think women could use their wombs as aquariums or incubators.


What about the problem of incompatibility between human and, for example, shark placenta?
The placenta is formed from the fetus, not from the mother, so there is no need to modify human DNA. I was assured that a placenta suitable for a shark or a dolphin could be created in the human body by modifying only the DNA of the animal itself. I haven’t fully explored this topic yet, but it seems to me that sharks are more compatible with humans. And in general, sharks suit me on every criterion: they are an endangered species, their life expectancy is almost the same as a human’s, and they are also very tasty.

Do you think women will agree to bear the fruits of other animals?
In order to bear a shark, your period must stop. But drugs that stop menstruation have many unpleasant side effects. So I think the ideal candidate for carrying a shark would be a woman who is rich, single, and, most importantly, past menopause.

What are the advantages of all this?
We don’t need more people, there are already too many of them. Well, in general, this is a way to save endangered species of animals.
Plus, it’s a new way to produce food. After all, it is quite logical to eat a shark as soon as it is born.

Right! And in this way you will no longer feel guilty about eating another living being. In addition, it is not as expensive as raising a child, and there is less responsibility. It seems to me that it is much more terrible to raise a child and then realize that you do not love him at all.


Which animal would be the most dangerous to bear?
Elephant, because of its size.

Okay, what’s the coolest thing?
Well, the simplest would be a chimp, because their DNA is very similar to a human’s. But I’m not interested, because I don’t eat monkeys. I would really like to give birth to a Maui dolphin. They are so cute, plus they are smart; we could easily communicate. It would be cool to swim in the sea with them. And besides, I really love a good piece of dolphin meat, but every time I eat it I worry that I am eating an endangered animal.

I thought you’d prefer a shark.

I love sharks too. They are almost as smart as dolphins. I especially love zebra sharks. They are so round and have such sweet little faces, just like puppies!

Do you think people will agree to eat meat that has grown inside a person?
Yes. After all, there are animals that eat their own young, and we eat calves that grow inside cows. We kill people, and not even for food, yet they too come from a woman’s womb. So I don’t see a problem with that.

Do you think meat from these animals would taste better?
Probably, after birth, they will need to be released into their natural environment for a short time so that they do not differ in taste from the rest.

Sounds reasonable. Would you eat the dolphin you gave birth to yourself?
I would eat it after it died. It would be cool if I could track its whereabouts using GPS. And as soon as it was brought to market, I would buy it and eat it. That way it would return to my body one last time.


I see that you have thought everything through.