
AI2 releases MolmoAct 2 for robot manipulation tasks

May 6, 2026
AI2 released MolmoAct 2, a vision-language-action model built for bimanual robot manipulation, accompanied by a large open dataset of real-world demonstrations. The model improves on its predecessor's action reasoning with stronger spatial grounding and task generalization. Robotics researchers and embodied AI teams should prioritize evaluation, as open bimanual manipulation datasets at this scale remain rare. Direct comparisons against RT-2 and OpenVLA on dexterous tasks will be the real test.