A collaboration involving researchers at Waterloo Engineering and the Karlsruhe Institute of Technology (KIT) in Germany has produced an extensive open-source dataset of images to help advance automated manufacturing.
The dataset comprises more than 200,000 images created both in the metaverse and in the real world, making it the largest of its kind for developing artificial intelligence (AI) for vision-driven robotics in manufacturing environments.
The collection includes detailed annotations and offers far greater diversity than existing datasets.
“It has huge potential to drive innovation and accelerate research in Industry 4.0 and robotics and AI by giving researchers and engineers the rich data source they need to really push this area,” said Alexander Wong, a systems design engineering professor at Waterloo.
The research team, recently a finalist for a best paper award at the IEEE International Conference on Automation Science and Engineering in Mexico City, also included postdoctoral researcher Yuhao Chen and master’s student Emily ZhiXuan Zeng from Waterloo Engineering, and Maximilian Gilles and Tim Robin Winter from KIT.
The project received funding from both the German and Canadian governments, and involved the multinational company Festo and DarwinAI, a Waterloo spinoff startup.
The team's paper is titled MetaGraspNet: A Large-Scale Benchmark Dataset for Scene-Aware Ambidextrous Bin Picking via Physics-based Metaverse Synthesis.