The Future of Manufacturing
Augmented reality (AR) may be the future of manufacturing, as firms seek innovative solutions to existing industrial problems. Industrial uses for AR are directed towards improving key areas: efficiency, quality and productivity.
AR relies on accurate position tracking of objects and on visual processing. Current AR solutions require an environment with sufficient available trackers, or differentiating visual features. These allow the AR algorithms behind the user interface to calculate the precise position of the camera in relation to the physical environment. Real objects identified in the camera feed are then overlaid with a 3D virtual model to generate the resulting AR effect.
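The overlay step described above can be sketched in a few lines: given an estimated camera pose and pinhole intrinsics, each 3D model point is projected into pixel coordinates for drawing over the camera feed. All of the parameter values here (focal lengths, principal point, pose) are illustrative assumptions, not details of any particular AR toolkit.

```python
import numpy as np

def project_points(points_world, R, t, fx, fy, cx, cy):
    """Project 3D world points into pixel coordinates using a
    simple pinhole camera model (hypothetical parameters)."""
    pts_cam = (R @ points_world.T).T + t   # world -> camera frame
    x = pts_cam[:, 0] / pts_cam[:, 2]      # perspective divide
    y = pts_cam[:, 1] / pts_cam[:, 2]
    u = fx * x + cx                        # apply camera intrinsics
    v = fy * y + cy
    return np.column_stack([u, v])

# Identity pose: camera at the origin, looking down +Z
R, t = np.eye(3), np.zeros(3)
pts = np.array([[0.0, 0.0, 2.0]])          # one model point 2 m ahead
uv = project_points(pts, R, t, fx=800, fy=800, cx=320, cy=240)
# A point on the optical axis lands at the principal point (320, 240)
```

In a real system the pose `(R, t)` would come from the tracking layer each frame; the projection itself stays this simple.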
Trackers can help to situate the correlated 3D models accurately, but they are subject to environmental influences. Visual features must be clear enough for accuracy, and are often affected by line of sight and lighting quality. One way of improving this is to attach stickers as visual markers, on or alongside objects, to enhance clarity for the trackers and enrich the definable environment. Alternatively, embedded 3D cameras can improve tracking by providing an indication of depth between the target object and the camera, adding an extra 3D mapping layer.
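As a rough illustration of the extra depth layer a 3D camera provides, the sketch below averages valid readings from a depth map inside a region of interest. The region bounds and the zero-means-missing convention are assumptions for the example, not a property of any specific depth camera.

```python
import numpy as np

def object_depth(depth_map, roi):
    """Estimate camera-to-object distance from a 3D camera's depth
    map by averaging valid readings inside a region of interest
    (roi is a hypothetical bounding box: row0, row1, col0, col1)."""
    r0, r1, c0, c1 = roi
    patch = depth_map[r0:r1, c0:c1]
    valid = patch[patch > 0]        # zeros mark missing depth readings
    return float(valid.mean()) if valid.size else None

depth = np.zeros((4, 4))
depth[1:3, 1:3] = 1.5               # object surface 1.5 m from the camera
d = object_depth(depth, (1, 3, 1, 3))   # -> 1.5
```

On a uniform surface every reading in the patch is similar, which is exactly why depth alone cannot disambiguate position there, as the article notes.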
Most use cases integrate AR with those parts of the production line where physical machinery, moving parts and robots can most feasibly be linked to 3D graphic models. Instances include instructions for various processes such as asset maintenance, assembly, validation and quality processing. Using a 3D design model for validation could reduce both the process cycle time, and the potential for human error. An operative on the shop floor might use AR to identify which part to use next in an assembly sequence – or to validate a robotic program with a part generated by AR rather than a physical part. 3D models may also be overlaid onto physical parts for the purposes of quality inspection, contrasting the "as-built" product with the "as-planned" version.
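The "as-built" versus "as-planned" comparison can be illustrated with a toy deviation check, assuming point correspondences between the measured part and the design model have already been established (that correspondence step is glossed over here):

```python
import numpy as np

def max_deviation(as_planned, as_built):
    """Worst-case per-point deviation (in the model's units) between
    the as-planned design points and the as-built measurements.
    Rows are assumed to correspond point-for-point."""
    return float(np.linalg.norm(as_built - as_planned, axis=1).max())

planned = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
built   = np.array([[0.0, 0.0, 0.001], [1.0, 0.0, 0.0]])
dev = max_deviation(planned, built)   # 1 mm worst-case deviation
```

An inspection step would then compare `dev` against the part's tolerance to flag a nonconforming build.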
There are several challenges to developing AR for industry. Most industrial environments do not offer clear trackers on all available surfaces, and the lighting conditions may well be inadequate. There may be significant variations in scale or colour between the physical objects and their 3D representation. Employing visual stickers to enhance the trackers in an industrial environment might involve excessive time spent in preparation, potential inaccuracies, and user error. Full coverage is not always possible, so this AR solution can be only partial, and would not overcome the issues of data revision.
3D cameras are also only a partial solution, because many target environments have uniform surfaces that provide no differential depth information. Mounting an array of cameras is also expensive, and adds significantly to the setup in terms of hardware and infrastructure. There is also a divergence in AR research according to the user interface employed. One avenue leads towards wearable devices, usually head-mounted, while the other is aimed at mobile devices. Workers on the shop floor are generally reluctant to wear head-mounted devices for the length of time required on the production line: they can limit a person's field of view, cause discomfort and disorientation, and raise health concerns.
Siemens tackle the problem
Existing AR solutions were somewhat limited for industrial use, and new technology was required for the future of manufacturing. Siemens found that greater AR accuracy could be achieved by combining current VR technologies, fusing computer-vision algorithms with reliable, low-cost inertial sensors and laser lighthouses. From this, they developed a system called PointAR. Trackers are still used on physical fixtures, with each PointAR unit carrying out a unique calibration process that determines the offset between its tip, the camera and the trackers. A 3D model can then be accurately overlaid onto the physical part. This initial positioning is required when the AR application is launched, aligning the real-world coordinate system with that of the 3D model and allowing the camera to move freely thereafter.
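The calibration-offset idea can be sketched as a simple transform composition: once a fixed tracker-to-tip offset is known, the tip's world position follows from the tracked pose. The 4x4 homogeneous-matrix convention and all values below are assumptions for illustration, not Siemens' actual calibration procedure.

```python
import numpy as np

def tip_in_world(T_world_tracker, p_tip_in_tracker):
    """Locate the unit's tip in world coordinates by composing the
    tracked pose (a 4x4 homogeneous transform) with a fixed,
    pre-calibrated tracker-to-tip offset vector."""
    p = np.append(p_tip_in_tracker, 1.0)   # homogeneous coordinates
    return (T_world_tracker @ p)[:3]

# Tracker sits at (1, 0, 0) in world with no rotation; the tip is
# 10 cm along the tracker's local x axis.
T = np.eye(4)
T[:3, 3] = [1.0, 0.0, 0.0]
tip = tip_in_world(T, np.array([0.10, 0.0, 0.0]))   # -> (1.10, 0, 0)
```

The one-off calibration determines `p_tip_in_tracker` (and the analogous camera offset); after that, every tracked pose immediately yields the tip and camera positions needed for the overlay.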
After reviewing various tracking methodologies, Siemens developed PointAR as a unique hybrid. Computer-vision technology and sensors were combined to produce an image overlay with an absolute position accuracy of 0.5–2 mm and an angular accuracy of 0.05–0.2°. This approach is insensitive to environmental factors such as lighting and background conditions. It is sufficiently accurate to validate the performance of a manual operation, which is a key factor in a robust AR solution. Successful initial tests have been completed, and PointAR is now being trialled on a German production line.
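As a minimal sketch of how such accuracy figures might feed into validation, the check below treats the worst-case published bounds (2 mm, 0.2°) as hypothetical pass/fail tolerances; the function name and thresholds are assumptions, not part of PointAR.

```python
def within_spec(pos_err_mm, ang_err_deg,
                pos_tol_mm=2.0, ang_tol_deg=0.2):
    """Check a measured overlay error against a hypothetical accuracy
    envelope (defaults taken from the worst-case published figures)."""
    return pos_err_mm <= pos_tol_mm and ang_err_deg <= ang_tol_deg

ok = within_spec(1.2, 0.1)    # inside the envelope -> True
bad = within_spec(3.0, 0.1)   # position error too large -> False
```

A validation step on the line could log any operation whose measured deviation falls outside this envelope for rework or inspection.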