If the authorities and the U.S. Postal Service have their way, the hoverboard might be a thing of the past. Thankfully, faulty batteries and the occasional explosion don’t seem to bother Segway or Intel, considering the two companies premiered a robotic assistant during Intel’s forward-thinking keynote address at CES 2016 in Las Vegas.
Intel showcased the extendable, rideable robot — dubbed the Segway Robot, for now — alongside forthcoming consumer drones and a chipset built specifically for wearables, among other things. It rolled onto the stage with an adorable expression that could rival that of a newborn infant, capitalizing on Intel’s RealSense RGB-D camera, which gives the self-balancing device a sense of spatial depth for tracking and mapping. Intel’s Atom processor makes it all possible, as do the hardware’s GPU acceleration and embedded vision algorithms.
Intel wasn’t alone in the effort, though. The robot represents a partnership between the behemoth tech giant and Ninebot, the Xiaomi-backed company that recently acquired Segway. The collaboration involves a wealth of technology — including voice capabilities, a livestreaming camera, and facial recognition — which in combination allowed the robot to navigate around Intel’s mock living room and communicate with its inventor like something out of the waste-covered world of WALL-E.
Segway supposedly has plans to make the Segway Robot commercially available sometime next year, but a developer kit is slated to launch in the second half of 2016 at an undisclosed price. The kit will give developers access to an open-source SDK, allowing them to build new applications for the robot before it’s readily available. Unfortunately, we doubt many developers will be able to tackle the challenges that accompany a pair of arms with the mobility of a Lego figurine. I guess the future will have to wait.