ADLINK Technology Inc. has announced that its edge computing platform for artificial intelligence is now fully compatible with LIPS Corp.'s perception development kit, which is based on NVIDIA's Isaac Perceptor. This collaboration integrates ADLINK's DLAP-411-Orin with LIPS' LIPSedge AE Active Stereo 3D camera series, targeting developers working on mobile robots and AI vision applications in sectors like manufacturing and warehouse logistics. The partnership aims to streamline the development process for autonomous mobile robots (AMRs) by providing a robust hardware stack that enhances perception capabilities.
Previously, integrations between edge computing platforms and perception kits often faced challenges in compatibility and performance optimization. The collaboration between ADLINK and LIPS addresses these issues directly, offering developers a pre-integrated and validated solution. By combining their technologies, both companies aim to set a new standard in the development of autonomous systems.
How Does the Edge Platform Enhance AMR Performance?
The ADLINK DLAP-411-Orin, powered by the NVIDIA Jetson AGX Orin module, delivers up to 275 trillion operations per second (TOPS) of AI computing power. This high-performance edge computing capability supports multi-camera synchronization, which is essential for advanced vision applications in mobile robots. Additionally, the integration with LIPSedge AE Active Stereo 3D cameras provides superior 3D depth sensing and image capture, offering a wider field of view and higher resolution compared to traditional lidar systems.
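To illustrate what a depth-sensing camera like the LIPSedge AE ultimately feeds into a robot's perception stack, the sketch below back-projects a depth map into a 3D point cloud using the standard pinhole camera model. This is a generic, minimal example in Python with NumPy; the function name and the camera intrinsics (`fx`, `fy`, `cx`, `cy`) are illustrative assumptions, not part of any LIPS or ADLINK SDK.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (meters) into an Nx3 point cloud
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    Note: fx/fy/cx/cy are hypothetical intrinsics for illustration."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy 2x2 depth map; the zero entry models a pixel with no stereo match.
depth = np.array([[1.0, 2.0],
                  [0.0, 1.5]])
pts = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
```

In a real AMR pipeline the same transformation would run per camera, with multi-camera synchronization ensuring the resulting point clouds are captured at the same instant before being fused.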
What Applications Benefit from the LIPSAMR Perception DevKit?
The LIPSAMR Perception DevKit, which includes the DLAP-411-Orin and LIPSedge AE Active Stereo 3D cameras, is designed to support a variety of industrial applications. These include automated quality inspection in manufacturing, material handling and sorting in warehouse logistics, and outdoor inspection and maintenance tasks. The DevKit's AI-driven depth sensing and 3D occupancy grid mapping enable precise navigation and environmental interaction, making it a valuable tool for developing sophisticated AMR applications.
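The 3D occupancy grid mapping mentioned above can be sketched in its simplest form: discretize space into voxels and mark any voxel containing a sensed 3D point as occupied. The Python snippet below is a minimal, generic illustration of that idea; the function name, voxel size, and grid layout are assumptions for this sketch and do not reflect the DevKit's actual API or Isaac Perceptor's internals.

```python
import numpy as np

def points_to_occupancy_grid(points, voxel_size, grid_shape, origin):
    """Mark voxels that contain at least one 3D point as occupied.
    points: Nx3 array in meters; origin: the grid's minimum corner
    in world coordinates. Purely illustrative, not a vendor API."""
    grid = np.zeros(grid_shape, dtype=bool)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    # Keep only indices that fall inside the grid bounds.
    in_bounds = np.all((idx >= 0) & (idx < np.array(grid_shape)), axis=1)
    idx = idx[in_bounds]
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid

# Three sample points; the last lies outside the 1 m^3 grid and is ignored.
points = np.array([[0.05, 0.05, 0.05],
                   [0.95, 0.50, 0.10],
                   [2.00, 0.00, 0.00]])
grid = points_to_occupancy_grid(points, voxel_size=0.1,
                                grid_shape=(10, 10, 10),
                                origin=np.array([0.0, 0.0, 0.0]))
```

A planner can then treat occupied voxels as obstacles, which is what makes this representation useful for the navigation tasks the DevKit targets.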
How Does the Collaboration Address Development Challenges?
“We are excited to partner with LIPS on the AMR 3D x AI Perception Solution, integrating LIPS’ 3D cameras with our DLAP-411-Orin and leveraging the features of NVIDIA Isaac Perceptor,”
said Ethan Chen, general manager of ADLINK’s Edge Computing Platform business unit. This collaboration addresses common challenges in AMR integration by providing a plug-and-play solution that has passed NVIDIA certification, thereby reducing development time and facilitating easier incorporation into existing robotic systems.
“We are pleased to collaborate with NVIDIA and ADLINK,”
stated Luke Liu, CEO of LIPS Corp. This partnership not only enhances the capabilities of AMRs but also simplifies the integration process for developers, allowing them to focus on creating innovative solutions rather than dealing with compatibility issues.
The combined efforts of ADLINK and LIPS are expected to significantly impact the development of autonomous systems by providing a comprehensive and efficient platform. This integration offers a competitive edge to developers working on mobile robots, enabling them to harness advanced AI and 3D perception technologies with greater ease and reliability.