At Tesla’s 2022 annual shareholder meeting this week, the company unveiled a new image to promote its upcoming AI DAY: PART II. Scheduled for September 30th, the event builds on last year’s AI Day event when TeslaBot was first unveiled.
The teaser image shows what is expected to be a TeslaBot prototype hand, and offers a lot of insight into Tesla’s progress on humanoid robots.
Before diving into each of the focus areas in this breakdown, I’d like to touch on my expectations for the TeslaBot demo. When Tesla first unveiled the TeslaBot, many strongly questioned whether it would be possible to achieve anything close to the static mockup Tesla presented on stage, and later at its Cyber Rodeo event in Texas.
Tesla isn’t alone in building humanoid robots, but the reason it’s getting into this space is that it believes it has the necessary puzzle pieces to make it happen. We’ve had humanoid robots before, but what we’re hoping for from Tesla is a robot that actually resembles the human form, a dramatic step up from toys like Honda’s Asimo and SoftBank Robotics’ Pepper. The TeslaBot should also be a big step forward in design and functionality over what Boston Dynamics, considered by many to be the leader in this field, has achieved.
This new image offers real hope that Tesla is on the road to delivering the previously shown form factor, and we can actually see the inner workings of the arm, wrist, and fingers. There’s no consumer-friendly cover here, but the biggest question was never about Tesla’s ability to produce hard plastic with a glossy white finish; it’s whether Tesla can make a functional robot with the dimensions and design it showed on stage.
After seeing this image, my expectation for the prototype roughly eight weeks from now is a working humanoid robot that can stand, walk, and pick up, sort, and arrange objects based on a computer vision system. This may be fairly rudimentary at first, given that the TeslaBot has only been public for a year, though it’s reasonable to assume development was underway at Tesla before that date.
I don’t think Tesla would release an image like this and build excitement for AI Day 2 unless it already had the basics working: a battery powering a microcontroller, which in turn communicates with the actuators that move the various appendages. The question is whether the movements we see from the bot are pre-programmed, or whether it is already processing dynamic inputs and requests and can respond accordingly.
What would be truly amazing is to see a multitude of these bots working together to solve a task, such as moving objects from point A to point B.
I’d also be very impressed if the bot could perform a task and dynamically adapt (without falling over) to objects placed in its way.
I would also like to know more about battery life. Hardware and software optimization can only go so far; these bots need to run for hours between charges to live up to the promise of improved efficiency and run times over humans.
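To put the runtime question in perspective, here is a back-of-envelope calculation. Every number in it is an assumption for the sake of arithmetic, not a published TeslaBot specification:

```python
# Rough runtime estimate. All figures are hypothetical assumptions,
# not Tesla specs.
PACK_CAPACITY_WH = 2500    # assumed battery pack capacity, watt-hours
DRAW_WALKING_W = 500       # assumed average power draw while walking
DRAW_IDLE_W = 100          # assumed average draw while standing idle

def runtime_hours(capacity_wh: float, avg_draw_w: float) -> float:
    """Hours of operation at a constant average power draw."""
    return capacity_wh / avg_draw_w

print(f"Walking: {runtime_hours(PACK_CAPACITY_WH, DRAW_WALKING_W):.1f} h")  # 5.0 h
print(f"Idle:    {runtime_hours(PACK_CAPACITY_WH, DRAW_IDLE_W):.1f} h")     # 25.0 h
```

Under those assumed numbers, a full shift of walking work is only plausible with aggressive power management, which is exactly why software optimization matters here.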
In case you missed it, here are some images from the original TeslaBot announcement.
Now, on to the breakdown of the image.
In order for the TeslaBot to function, it needs structural strength, which in humans is provided by bone and muscle. Here, the TeslaBot’s endoskeleton appears to consist of a series of articulated parts driven by electric actuators. We know from its Gigacastings that Tesla can leverage SpaceX’s materials science to create new alloys, and my hope is that what we see here is a form of aluminum that is both strong and lightweight.
We know the bot is slated to weigh about 56 kg (125 lb), but we don’t know how many batteries it will need or how they will be distributed. It makes sense to stow them in the torso, and possibly the legs, to keep the center of gravity as low as possible.
It’s almost a shame to hide all this engineering behind an exterior, but it will inevitably be wrapped in a protective cover that gives the bot its signature look.
Hidden in the fingertips, especially in what appears to be the thumb, are holes that may contain sensors. These could work much like the ultrasonic sensors in your car, determining how far away an object is. For the bot, I’d want that centimeter-level precision improved to millimeter-level, which matters for properly handling objects.
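If these really are ultrasonic sensors, the underlying math is plain time-of-flight: a pulse travels out, echoes back, and distance is half the round trip at the speed of sound. A minimal sketch (the timing values are hypothetical, and nothing here is confirmed about Tesla’s hardware):

```python
SPEED_OF_SOUND_M_S = 343.0  # in dry air at roughly 20 °C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an object from an ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2

# A 1 ms round trip puts the object about 17 cm away.
distance_cm = echo_distance_m(0.001) * 100
print(f"{distance_cm:.0f} cm")
```

Note that millimeter precision would require timing resolution of roughly 6 microseconds per millimeter of range, which is one reason automotive ultrasonic sensors typically settle for centimeter accuracy.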
Each component of the bot has to operate in a precise symphony of motion, and to drive extremities like the fingers and fingertips, Tesla appears to be doing some serious cable management here. The two joints of the thumb seem to have wires wrapped around the actuators, connecting them to the sensors in the fingertips and carrying the commands that tell each actuator what rotation to perform.
These hinge-like thumb joints are quite complex, but the bot would be much more useful if the thumb could rotate on two axes (like ours) rather than using a simple open-and-close mechanism, which would limit how the thumb can be positioned to manipulate objects.
Gamers might think of this as placing and lifting a thumb on a controller’s thumbstick: lateral movement is also needed if you want to rotate the stick.
Having the ability to move almost any joint in almost any direction sounds very beneficial, but it comes at a cost to battery life, so it falls to Tesla’s software optimizations to use these movements without waste. Engineers may have to weigh making the robot behave more like a human against potential power savings and longer battery life.
It would be really disappointing if the TeslaBot looked like a human but walked and moved like a machine. Why bother to look human in the first place?
If Tesla can bring its more energy-dense 4680 battery to the bot, this trade-off may no longer be a forced choice, with energy capacity to spare. That raises the question of the TeslaBot’s timeline, one I hope we’ll learn much more about from the prototype: based on the progress and features shown, will this be a reality in one year, or five?
The wrist hinge looks strong and sturdy, as it should: it has to support the weight of the hand plus the stated 20 kg (45 lb) carrying capacity. At first glance, these look like regular cylinders that pass through mounting points and are capped with nuts on each end.
If you look closely, though, the side facing the camera is finished in black, has a hole in the middle, and lacks the usual hex nut. The gray end plates also appear to have cutouts. Presumably this is for weight savings, but it may actually house a rotation limit, similar to a steering lock, so the actuator cannot over-rotate the joint.
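Whether the stop is mechanical or electronic, firmware for a joint like this would typically enforce the same idea in software by clamping any commanded angle to the joint’s safe range before it reaches the actuator. A minimal sketch (the limit values are hypothetical):

```python
# Hypothetical joint limits for a wrist hinge, in degrees.
WRIST_MIN_DEG = -70.0
WRIST_MAX_DEG = 85.0

def clamp_command(angle_deg: float,
                  lo: float = WRIST_MIN_DEG,
                  hi: float = WRIST_MAX_DEG) -> float:
    """Limit a commanded joint angle so the actuator never over-rotates."""
    return max(lo, min(hi, angle_deg))

print(clamp_command(120.0))   # a request past the stop is held at 85.0
print(clamp_command(-90.0))   # held at -70.0
print(clamp_command(10.0))    # in-range commands pass through: 10.0
```

A mechanical stop then acts as the last line of defense if the software limit ever fails.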
Each arm appears to have a box on top. Given the weight penalty this would add to the arm, I don’t think this is a local battery pack. Instead, I believe these are microcontrollers that manage commands from the bot’s brain (the FSD computer), and may even receive them wirelessly. Given that the sides look like polished aluminum, some sort of break in the metal would be needed to let wireless signals pass between components.
With no wires running from the main limbs back to the HW3/4 computer, the weight and complexity of wiring would be greatly reduced. This assumes Tesla can solve the interference and security issues that come with wireless protocols. Since wireless architectures are already being discussed as the future state of the automobile, this could be a great opportunity to implement such a system.
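If limb controllers really did receive commands wirelessly, the interference concern largely comes down to integrity checking: each command frame would need something like a checksum so a corrupted packet is dropped rather than acted on. A toy sketch of that idea, with an entirely hypothetical frame layout (CRC32 as a stand-in integrity check; nothing here is anything Tesla has described):

```python
import struct
import zlib

def encode_command(joint_id: int, angle_centideg: int) -> bytes:
    """Pack a joint command and append a CRC32 so corruption is detectable."""
    payload = struct.pack("<Bh", joint_id, angle_centideg)
    return payload + struct.pack("<I", zlib.crc32(payload))

def decode_command(frame: bytes):
    """Return (joint_id, angle) if the CRC matches, else None (drop the frame)."""
    payload, (crc,) = frame[:-4], struct.unpack("<I", frame[-4:])
    if zlib.crc32(payload) != crc:
        return None
    return struct.unpack("<Bh", payload)

frame = encode_command(3, 4500)                # joint 3 -> 45.00 degrees
print(decode_command(frame))                   # (3, 4500)
corrupted = bytes([frame[0] ^ 0xFF]) + frame[1:]
print(decode_command(corrupted))               # None: bit flip rejected
```

Real wireless links would layer retransmission, encryption, and authentication on top of this, which is where the security questions come in.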
The only problem with this theory is the power puzzle. Even if communication is wireless, wires are still needed to deliver power to the hand, elbow, shoulder, and leg actuators.
Comparing this square box on top of the wrist to the prototype shown on stage, the wrist is one of the smallest-diameter parts of the body, so it’s hard to imagine everything fitting under the outer shroud.
Another explanation is that this is temporary, simply a result of early research, and something more refined will come later. In fact, I’d expect the TeslaBot we eventually see on stage to be slightly larger (perhaps 5%) than the original concept, to accommodate the fact that much miniaturization is unlikely at this stage. That would be my approach.
Let us know what you’d like to see from TeslaBot at AI Day 2 next month by leaving a comment below.
Breakdown: What TeslaBot’s Hands Tell Us About Tesla’s Robotic Advances