Carnegie Robotics and Mine Vision Systems Partner to Deliver Smart Mining Solutions

Carnegie Robotics, LLC (CRL), a pioneer in rugged robotic systems, and Mine Vision Systems (MVS), a leader in 3D vision and mapping software, are partnering to provide innovative solutions that will improve safety, increase productivity, and advance the long-term sustainability of the mining sector. Under this agreement, MVS's modeling software will be paired with CRL's line of MultiSense cameras to deliver a robust, computer-vision-based mapping tool for mining operations that is faster and more accurate than any currently on the market.

Smart mining solutions bring the industry's vision of sustainability through a fully automated mine closer to reality. The MVS-MultiSense solution supports that vision by combining CRL's high-data-rate, high-accuracy 3D range sensors with MVS's geological and geotechnical monitoring software to make mining equipment more intelligent and productive. It produces high-fidelity 3D models and high-resolution 2D images of the mine within hours instead of weeks, giving operators near-real-time information for mine management, and it supports fleet tracking throughout the mining operation, allowing for increased equipment performance and efficiency. The MVS-MultiSense tool delivers unprecedented geotechnical insight through accurate mapping and modeling of the mining environment, providing a more thorough understanding of the working environment that increases safety and decreases exposure to life-threatening hazards.

Northrop Grumman and U.S. Navy Complete CDR for EOD Robotic System

Northrop Grumman Corporation has announced that, in conjunction with the U.S. Navy, it has successfully completed the Critical Design Review (CDR) for Increment 1 of the Advanced Explosive Ordnance Disposal Robotic System (AEODRS) program.

Carnegie Robotics is proud to be part of the Northrop Grumman team that is designing and producing this advanced EOD robot for the U.S. Navy.  

Carnegie Robotics' Power Capability Module (CM):

  • Provides monitored and conditioned power, including soft-fusing, to the rest of the robotic system.
  • Distributes the intrasubsystem gigabit ethernet network when paired with the Master Capability Module.
  • Interfaces via MIL-STD-38999 connectors that conform to the Unmanned Ground Vehicle (UGV) Interoperability Profile (IOP).

More information about the AEODRS program is available at Unmanned System Technology.

MIT Enterprise Forum event will showcase Pittsburgh & Carnegie Robotics

On March 29th, the MIT Enterprise Forum is holding an event at the Alpha Lab Gear space in Pittsburgh's East Liberty neighborhood.  "Pittsburgh Presents Robotics: Why the World is Watching" will be moderated by David Kalson of Cohen & Grigsby and the panel will include:

  • Steve DiAntonio, President and CEO of Carnegie Robotics
  • John Bares, Founder of Carnegie Robotics and Director of the Uber Advanced Technology Center
  • Chris Moehle, Managing Director of Coal Hill Ventures
  • Jim Rock, CEO of Seegrid

You can register for the event at EventBrite.  For additional information please see this press release.


R&D program for ship firefighting uses the MultiSense SL

The MultiSense SL is being used on a firefighting robot named SAFFiR, for Shipboard Autonomous Firefighting Robot.  The Navy R&D grant funds motion-planning software development that will be carried out at Worcester Polytechnic Institute, using an existing robot from Virginia Tech.

The video below shows the bipedal robot in some initial tests and some of the live 3D data from the MultiSense SL sensor integrated into the robot's head.


Sponsored by the Office of Naval Research (ONR), SAFFiR is a two-legged, or bipedal, humanoid robot designed to help researchers evaluate unmanned systems to support Sailors with damage control aboard naval vessels.

Product Support


The Carnegie Robotics website has had a product support section for many months now, but we haven't highlighted it until now.  It has a wealth of information, including:

  • General product documentation, specifications, & interface drawings.
  • ROS driver documentation.
  • Release notes for Firmware, LibMultiSense, ROS driver, and Cloud Demos software package.
  • Example code on reprojection, calibration, and ROS transform tree usage.
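To give a flavor of the reprojection material, here is a minimal sketch of the underlying math, assuming an idealized rectified stereo pair. The function name and numbers are illustrative only and are not taken from our documentation or example code:

```python
import numpy as np

def reproject(u, v, disparity, f, baseline, cx, cy):
    """Convert a pixel (u, v) with stereo disparity (in pixels) into a
    3D point in the left camera frame, assuming a rectified pair with
    focal length f (pixels), baseline (meters), and principal point (cx, cy)."""
    if disparity <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    z = f * baseline / disparity          # depth from similar triangles
    x = (u - cx) * z / f                  # back-project through the pinhole model
    y = (v - cy) * z / f
    return np.array([x, y, z])

# Example: a point seen at the principal point with 10 px disparity,
# f = 500 px, baseline = 0.07 m -> a point 3.5 m straight ahead.
point = reproject(320.0, 240.0, 10.0, f=500.0, baseline=0.07, cx=320.0, cy=240.0)
```

Applied per pixel with a sensor's calibrated parameters, this same relationship is what turns a disparity image into a dense 3D point cloud.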

I invite you to check it out.  We'd love to hear your ideas for additional information, code, data, etc. to add to the support section.


DRC Finals

Carnegie Robotics was on hand at the DARPA Robotics Challenge (DRC) Finals on June 5th and 6th, 2015.  We had a booth in the DRC Expo area and spent two days talking to the thousands of visitors and roboticists who attended and watched the exciting Finals.

We were excited to see the six Track B teams compete in the event.  All Track B teams used the Atlas robot from Boston Dynamics, which features a Carnegie Robotics MultiSense SL sensor as its main perception system -- or as one 6-year-old in our booth excitedly noted, "Look Mom.. it's Atlas's head!"  The DRC Track B teams that competed:

Additionally, five other DRC teams built their own robots and deployed sensors from Carnegie Robotics, namely:

Steve DiAntonio, Matt Alvarado, and Chris Osterwood had a great time at the event.  It was quite something to see such a large crowd of people cheering when points were awarded to teams and groaning when robots fell.  

Visualization of DRC Finals from MIT Team

To give you a sense of how operators controlled these robots from remote stations, I recommend watching this visualization from MIT's DRC team.  It shows MultiSense SL data from their first Finals run: the SL's LIDAR data, sometimes colorized using the SL's on-board camera; stereo data from the SL; and overlays showing how MIT's software interprets the live data streams and allows operators to make decisions and navigate this complex (for a robot) course.
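For readers curious how LIDAR colorization like this works in principle, here is a minimal sketch, assuming an idealized pinhole camera and LIDAR points already expressed in the camera frame. The function name and values are illustrative and are not MIT's code:

```python
import numpy as np

def colorize_points(points, image, f, cx, cy):
    """Project 3D points (N x 3, camera frame, z forward) into a pinhole
    camera image (H x W x 3) and sample an RGB color for each point.
    Points behind the camera or outside the image get no color (-1)."""
    colors = np.full((points.shape[0], 3), -1, dtype=int)
    h, w = image.shape[:2]
    for i, (x, y, z) in enumerate(points):
        if z <= 0:
            continue                      # behind the camera
        u = int(round(f * x / z + cx))    # pinhole projection, column
        v = int(round(f * y / z + cy))    # pinhole projection, row
        if 0 <= u < w and 0 <= v < h:
            colors[i] = image[v, u]
    return colors

# A toy 2x2 "image" and two LIDAR points: one straight ahead of the
# optical center, one behind the camera (which stays uncolored).
img = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]])
pts = np.array([[0.0, 0.0, 2.0],
                [0.0, 0.0, -1.0]])
cols = colorize_points(pts, img, f=100.0, cx=1.0, cy=1.0)
```

A real pipeline additionally has to transform the LIDAR points into the camera frame using the calibrated extrinsics between the two sensors, which is part of what makes an integrated, factory-calibrated head like the SL convenient.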

This video shows MIT's first DRC Finals run on Friday, scoring 7 of 8 points.  After some faults on their second run, they finished sixth.  More information about their runs is on their website.

MIT releases LCM driver for MultiSense SL

The MIT DRC team has graciously published their Lightweight Communications and Marshalling (LCM) driver for the MultiSense SL, created by them as part of their work for the DARPA Robotics Challenge (DRC).  

They have also released a new video showing their Atlas robot performing continuous path and step planning from just the stereo data of its MultiSense SL.

The video, titled "Enabled by Online Footstep Planning and Stereo Fusion," notes that all execution shown is autonomous.  Reference: Robin Deits and Russ Tedrake, "Footstep planning on uneven terrain with mixed-integer convex optimization," in Proceedings of the 2014 IEEE-RAS International Conference on Humanoid Robots (Humanoids 2014), Madrid, Spain, 2014.

Carnegie Robotics & AutonomouStuff at RoboBusiness

Carnegie Robotics had senior engineering staff on-site for RoboBusiness 2014 this past week in support of our distributor, AutonomouStuff.  The booth had a live demo of the MultiSense S7 running and our new S21 demo video on display, and we spent the days answering a multitude of questions about our stereo sensors and technology.

We were also honored to learn that the MultiSense S21 had been nominated for a Robotics Business Review 2014 Game Changer Award.  From their article:

The MultiSense S21 addresses one of the key long-term problems in robotics and automation: reliable 3D sensing of the world indoors and out, in a wide range of conditions, at a reasonable cost. This long-range, high-resolution, color 3D sensor has the potential to enable low-cost robotic, automation, and safety features for a wide range of applications including mining, construction, healthcare, and warehouse and factory automation.

For more details please see their article about the MultiSense S21.

Our new MultiSense S21 demo video shows the sensor operating in a wide variety of outdoor scenes, with several visualizations based on our Cloud Demo Software package, which is open source and slated for release in a few weeks.