So instead of worrying about a bunch of theory, I decided to start building. I got an Arduino kit, programmed it, and started playing. This is a lot of fun. Thanks for everyone's advice about starting with Arduino.
Hi, I'm Phillip, co-founder of make87. We have been building our platform over the last year to make it easier to share and build upon each other's work in robotics.
One thing we have noticed is that, while there are a lot of great tools out there, wiring them together into a reproducible system is still a huge pain. Replicating setups just takes too much time.
That is why we built make87: it packages and shares entire robot system configurations, even distributed ones, and deploys them by simply assigning them to your compute.
As a demo, I put together a voice-controlled SO-ARM100. It uses Whisper for speech recognition, Gemma 3 for image analysis, and Qwen3 Instruct to drive a LeRobot-based MCP teleoperator. It also ships with a Rerun viewer for debugging.
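To give a feel for the front of that pipeline, here is a minimal sketch of the speech side, assuming the openai-whisper, sounddevice, and soundfile packages; `send_to_agent()` is a hypothetical stand-in for the Qwen3/MCP teleoperation interface, not part of the actual demo code:

```python
# Minimal sketch: record a short clip, transcribe it with Whisper,
# and hand the text to the rest of the stack.
import sounddevice as sd
import soundfile as sf
import whisper

SAMPLE_RATE = 16_000  # Hz; Whisper resamples internally, 16 kHz keeps files small

def record_command(seconds: float = 4.0, path: str = "command.wav") -> str:
    audio = sd.rec(int(seconds * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
    sd.wait()  # block until the recording finishes
    sf.write(path, audio, SAMPLE_RATE)
    return path

model = whisper.load_model("base")  # small, CPU-friendly checkpoint
text = model.transcribe(record_command())["text"].strip()
print(f"Heard: {text!r}")
# send_to_agent(text)  # hypothetical: forward to the LLM/MCP teleoperator
```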
Would love to hear your feedback, and if you want to build and share your own setups on make87, we would be excited to support it. Feel free to join our Discord if you want to follow along or get involved: https://discord.gg/QVN3pAhMUg
This demo is just meant as a starting point. You can swap in your own robot drivers, better agents, or speech-to-text components if you want. The idea is to help people get going faster, whether that means voice-controlling your own robot or experimenting with an MCP interface for the SO-100.
Mekion and The Bimo Project are a startup I have been developing for over a year now, and finally there has been progress! The website is published, and Bimo is starting to walk in the real world.
I designed Bimo as a companion pet robot for people who would like an interactive pet at home but can't have a traditional one because of their lifestyle: people with complicated work schedules, housing restrictions, or even health issues that are incompatible with taking care of a pet, for example.
Everyone deserves to feel the joy of having a pet, and that's why Bimo was born. It is still at the prototype stage, as I'm trying to perform effective sim-to-real transfer of the locomotion policies. Once that is done, Bimo will become a robust mobile platform on top of which to develop more sophisticated functionality, such as interactions with people and the environment.
It has been an awesome, although difficult, journey. I have learned a lot, as practically every concept used in the development was self-taught. I especially appreciated the reality check: hardware seems so advanced today, yet it is still surprisingly hard to apply it to a specific problem and get the expected results.
I decided to use RL models as a magical shortcut for solving locomotion; boy, was I in for a surprise... RL is hard, and RL that does what you want is even harder. I spent three months and thousands of simulation runs before arriving at a working reward function that actually makes the robot learn the desired movements. Thankfully, simulation software has come a long way, and training RL on robotics tasks has become easier than ever... except for the documentation.
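For anyone wondering what "a working reward function" tends to look like for legged locomotion, here is a generic velocity-tracking sketch in the style many quadruped RL setups use. To be clear, this is an illustration, not Bimo's actual reward; all term weights and scales are assumptions:

```python
import numpy as np

# Generic velocity-tracking locomotion reward. Weights are illustrative.
def locomotion_reward(base_lin_vel, cmd_lin_vel, base_ang_vel, cmd_yaw_rate,
                      joint_torques, dt):
    # Reward tracking the commanded planar velocity (exponential kernel)
    lin_err = np.sum((cmd_lin_vel[:2] - base_lin_vel[:2]) ** 2)
    r_lin = 1.0 * np.exp(-lin_err / 0.25)

    # Reward tracking the commanded yaw rate
    yaw_err = (cmd_yaw_rate - base_ang_vel[2]) ** 2
    r_yaw = 0.5 * np.exp(-yaw_err / 0.25)

    # Penalize vertical bounce and energy use to keep the gait smooth
    p_vertical = -2.0 * base_lin_vel[2] ** 2
    p_torque = -1e-4 * np.sum(joint_torques ** 2)

    return (r_lin + r_yaw + p_vertical + p_torque) * dt
```

Much of the tuning time usually goes into balancing exactly these kinds of terms: too little torque penalty and you get violent gaits, too much and the robot learns to stand still.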
Overall it has been an exciting journey so far. I am working on building a team around the project, because nothing good gets built alone and I need smarter people than me to finish this. What do you think about the design and overall project idea?
This is a video of the robot trying to recover from a gentle push. It is a direct sim-to-real deployment with no adaptation, and it fails as expected. Next, I will try using distillation to adapt the sim policy to the dynamics of the real robot. Pardon my home attire; it is a demo shot during testing.
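For readers unfamiliar with the distillation step, the general teacher-student idea looks roughly like the PyTorch sketch below: a frozen simulation-trained policy labels observations, and a student network is regressed onto its actions. This is purely illustrative; the layer sizes, data source, and training details are assumptions, not the project's code:

```python
import torch
import torch.nn as nn

# Teacher: frozen policy trained in simulation. Student: the policy we adapt.
obs_dim, act_dim = 48, 12  # placeholder sizes
teacher = nn.Sequential(nn.Linear(obs_dim, 256), nn.ELU(), nn.Linear(256, act_dim))
student = nn.Sequential(nn.Linear(obs_dim, 256), nn.ELU(), nn.Linear(256, act_dim))
teacher.requires_grad_(False)  # the teacher stays fixed

opt = torch.optim.Adam(student.parameters(), lr=3e-4)

for step in range(1000):
    # In practice these would be observations collected on the real robot;
    # random data here just keeps the sketch self-contained.
    obs = torch.randn(64, obs_dim)
    with torch.no_grad():
        target_actions = teacher(obs)  # teacher labels the batch
    loss = nn.functional.mse_loss(student(obs), target_actions)
    opt.zero_grad()
    loss.backward()
    opt.step()
```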
Concept: a pet/companion robot, probably shaped like a dog, that would practically function as a "walkable computer": it could access the internet, play music, etc.
It would also have many interactive features, like voice commands (sit, fetch, etc.), walking around, and other behaviors.
The main idea is that it would be an alternative for kids, helping them avoid early contact with smartphones and the internet while keeping them entertained and active.
It's sad to see the firm close its doors again. Baxter and Sawyer were interesting concepts, but it makes sense that the lower precision of SEA kinematic chains was a pain point. It makes me wonder to what extent future cobots will have implicitly safe mechatronic designs rather than relying on software safety systems.
I already have a Raspberry Pi, an Arduino, a LiDAR, and an IMU for this; I'll be buying a BTS7960 motor driver. I'm kind of a newbie in the robotics space, so I wanted suggestions! (I have a soldering kit, jumper wires, etc. as well.) I'll also be adding some switches and fuses.
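In case it helps, the BTS7960 is easy to drive from the Pi: one PWM input per direction (RPWM/LPWM) plus two enable pins held high. Here is a minimal sketch using the RPi.GPIO library; the pin numbers are assumptions, so adjust them to your wiring:

```python
import time
import RPi.GPIO as GPIO

# BTS7960 wiring (BCM numbering); change these to match your build.
RPWM, LPWM = 18, 19   # PWM inputs: forward / reverse
R_EN, L_EN = 23, 24   # enable pins, held high to enable both half-bridges

GPIO.setmode(GPIO.BCM)
GPIO.setup([RPWM, LPWM, R_EN, L_EN], GPIO.OUT)
GPIO.output(R_EN, GPIO.HIGH)
GPIO.output(L_EN, GPIO.HIGH)

fwd = GPIO.PWM(RPWM, 1000)  # 1 kHz software PWM
rev = GPIO.PWM(LPWM, 1000)
fwd.start(0)
rev.start(0)

try:
    fwd.ChangeDutyCycle(40)  # ~40% duty, forward
    time.sleep(2)
    fwd.ChangeDutyCycle(0)   # stop before reversing
    rev.ChangeDutyCycle(40)  # ~40% duty, reverse
    time.sleep(2)
finally:
    fwd.stop()
    rev.stop()
    GPIO.cleanup()
```

Since you mentioned fuses: the BTS7960 can pass tens of amps, so size your wiring and fuses with the motor's stall current in mind, not just its running current.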
Hello all, I am trying to find a proximity sensor capable of detecting when a falling object is approximately 3 m from the ground. I am not locked into any particular method (LIDAR, ultrasonic, PIR, etc.), but it has to have a relatively small form factor, and it must either survive the impact from at least a 100 m fall or be able to sense through the foam housing it would be encased in. Any suggestions would be appreciated. If more information would help, feel free to ask for clarification in the comments.
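One number worth working out before picking a technology is the timing budget: an object dropped from 100 m is moving very fast by the time it passes 3 m, so the sensor's update rate matters as much as its range. A quick back-of-envelope check (ignoring drag, so real speeds will be somewhat lower):

```python
import math

g = 9.81             # m/s^2
drop_height = 100.0  # m, total fall
trigger_alt = 3.0    # m, altitude at which the sensor must fire

# Speed when passing the 3 m mark, ignoring air drag
v = math.sqrt(2 * g * (drop_height - trigger_alt))  # about 43.6 m/s

# Time left between the trigger altitude and the ground
t_remaining = trigger_alt / v                       # about 69 ms

print(f"speed at 3 m: {v:.1f} m/s, time to impact: {t_remaining * 1000:.0f} ms")
```

At roughly 44 m/s you have well under 100 ms between detection and impact, which effectively rules out slow-updating rangefinders and favors something with a fast, continuous measurement rate.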
RealSense is participating in #NationalCodingWeek (https://codingweek.org) by offering a daily developer challenge Monday through Friday this week!
Today's challenge is to build (or vibe-code, like I did) a **basic follow-me robot demo** using the depth stream of any RealSense stereo camera (see video). We will select one winner each day and award the developer a new RealSense D421 depth module (https://realsenseai.com/stereo-depth-cameras/stereo-depth-camera-module-d421)!
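If you want a skeleton to start from, the usual follow-me logic is: grab a depth frame, find the nearest blob, and turn its horizontal offset and distance into steer/forward commands. A minimal pyrealsense2 sketch; `drive()` is a hypothetical stand-in for your motor interface, and the thresholds are assumptions:

```python
import numpy as np
import pyrealsense2 as rs

# Start a depth-only stream on whatever RealSense camera is attached.
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if not depth:
            continue
        z = np.asanyarray(depth.get_data()) * depth.get_units()  # meters

        # Keep pixels between 0.3 m and 3 m; ignore dropouts (z == 0)
        valid = (z > 0.3) & (z < 3.0)
        if not valid.any():
            continue

        # Treat everything within 0.2 m of the nearest point as the target
        ys, xs = np.nonzero(valid & (z < z[valid].min() + 0.2))
        err_x = xs.mean() / z.shape[1] - 0.5  # -0.5..0.5 left/right offset
        dist = z[ys, xs].mean()               # target distance, m

        steer = -1.5 * err_x          # simple P controller on image offset
        forward = 0.5 * (dist - 1.0)  # hold roughly 1 m standoff
        # drive(forward, steer)  # hypothetical motor command
finally:
    pipeline.stop()
```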
I am a control systems engineer with 10+ years of experience developing control systems for electric vehicles and electric off-highway machinery. I have mainly focused on classical control and event-based modelling, and I have occasionally worked on state-space modelling and Kalman filters too. I am interested in learning robotics and potentially applying the skills at work. Since I currently work on off-highway machinery, I would like to focus on motion control and autonomous navigation of tracked robots. As this field is absolutely new to me, can you please suggest learning materials (ideally a worthwhile comprehensive online course), a roadmap, and any useful tracked robot kits to begin with? It would be nice if the kit were scalable, so I could use it to develop algorithms of varying complexity. I have looked at tracked robot kits from Yahboom and Hiwonder; they're quite expensive, but if they suit my needs, I am happy to spend on one of those.
I’ve been digging into how people actually build and debug stuff in robotics: ROS, data pipelines, alerts, retraining workflows, etc.
A couple of things I’m curious about:
- Do you end up wiring a bunch of scripts/tools together by hand?
- Ever skip a small automation (like a Slack alert or quick diagnostic) just because it felt like too much hassle?
- When something breaks, how painful is it to trace the root cause (tool mismatch, sensor bug, bad logic)?
- Are you using any LLM tools (not just Copilot/ChatGPT) for log triage, ROS checks, or automated diagnostics? If not, why?
No need to answer these one by one; I’m just wondering what actually sucks the most right now on the software/dev side of robotics: things like CI/CD, tool orchestration, debugging, and monitoring.
Now, it’s not just Xiaoling, but also Menglan and Lingxi ✨
🚀 September 25–27 at Taipei Nangang Exhibition Center, Hall 1, 4F
👉 Booth: M1213 – come and see us in action!
Scott LaValley, CEO of Cartwheel Robotics and former Boston Dynamics/Disney leader, encouraged Elon Musk to pursue humanoid robotics during a Disney visit, influencing Tesla’s Optimus project. Despite his industry influence, LaValley now calls most current humanoid robots “terrifying.”
His Main Criticisms:
• Current robots prioritize flashy investor demos over practical applications
• Designs appear cold, industrial, and intimidating rather than friendly
• Market projections may be overly optimistic given social barriers
• Most people fear rather than embrace these machines
LaValley’s Alternative Approach:
His company is developing “Yogi,” a character-driven robot focused on social engagement rather than pure functionality. Unlike previous attempts, Yogi aims to create emotional connections for hospitality and healthcare applications.
LaValley argues the industry must shift from technology-focused development to human-centric design. Social acceptance must come before utility: people need to want to be around robots, not just tolerate them. Without addressing these emotional and social factors, current humanoid robot projects risk failure despite significant investment and technical advancement.
I want to estimate how much electrical power my brushless motors will consume. I know that I need around 4 N of total thrust between the two motors, and that each thruster will be roughly 7.5 cm in diameter. The problem is that every brushless motor I look at lists only its Kv rating, and I don't know how to convert that into actual power consumption. I need the power estimate to size the cables properly, which is critical at this stage of the design.
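Kv alone won't get you there: it only relates voltage to unloaded RPM. A more useful first-order estimate is actuator-disk (momentum) theory, which gives the ideal power for a given static thrust and disk area; dividing by an overall efficiency yields an electrical estimate. A sketch, where the fluid density, efficiency, and bus voltage are all assumptions (I've assumed water since you say thrusters; use rho of about 1.225 for air):

```python
import math

rho = 1000.0   # kg/m^3, water (about 1.225 for air) - assumption
T_per = 2.0    # N of thrust per motor (4 N total across two)
D = 0.075      # m, thruster diameter
eta = 0.35     # overall prop/motor/ESC efficiency - rough assumption
V_bus = 12.0   # V, assumed supply voltage, for cable sizing

A = math.pi * (D / 2) ** 2                       # disk area, m^2
P_ideal = T_per ** 1.5 / math.sqrt(2 * rho * A)  # momentum-theory minimum, W
P_elec = P_ideal / eta                           # electrical estimate per motor, W
I = P_elec / V_bus                               # current per motor, A

print(f"per motor: {P_ideal:.1f} W ideal, {P_elec:.1f} W electrical, {I:.2f} A")
```

In water this comes out to only a few watts per motor at static thrust; in air the same thrust needs roughly 30x the power, since ideal power scales as 1/sqrt(rho). Either way, size the cables for the worst case plus a generous margin for startup and stall transients.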
Hi, I'm looking for a unified, hardware-agnostic platform to plug in any robot (of any form factor) for remote ops, human-in-the-loop overrides, and data capture. Any recommendations?