Presentation of KNAPP's KiSoft VISION

22/06/2010 by Damien Joguet
Tags: Events

OCTO has been designing state-of-the-art IT architecture for more than 12 years. Recently we realized that user interfaces needed to improve in order to bring more value to users, which is why we now work on the usability of IT systems. Add to that an interest in innovation processes and the will to partner with our clients "from concept to cash", and you will have a pretty good picture of the OCTO DNA.

Those traits explain why we were particularly interested in this presentation of KiSoft VISION, KNAPP's solution for order picking assisted by augmented reality.

We decided to invite KNAPP to OCTO to present their product, and the process that led to it, during one of our "Supply Chain Management School" sessions. We also invited some of our clients to join us and discover what we see as part of a larger trend: the emergence of new types of user interfaces.

This proposal was met with great enthusiasm from both KNAPP and our clients. As a result, Birgit Huber and Peter Stelzer flew from Austria to Paris on June 17 to give a presentation in front of more than 25 people at OCTO's premises.

There are many different ways to set up an order picking system, from totally manual to fully automated (for example ASRS: Automatic Storage and Retrieval Systems). For manual picking, paper-based picking is still prevalent but error-prone, so there are different ways to assist the pickers. Voice-directed picking is a fairly mature technology, but it has some drawbacks: the need to translate the instructions, the inability to check that the picked item is the correct one, and so on. In comparison, an augmented reality system should bring a "universal" interface, and the camera attached to the system could help validate the picked items. However, that must not come with an increased cost for additional hardware in warehouses. Those are some of the constraints the KNAPP team had to address for the KiSoft VISION project.

We learned about the predictions Peter made when he launched the project three years ago, when the technology was not yet ready, how those predictions came true (or not), how the system is designed (hardware and software), and how the challenges were overcome. He presented the architecture of the system, which is based on 2D markers that a camera detects in order to guide the picker through the warehouse, highlight the position on the shelf where the items are to be picked, and display the number of items to pick. Thanks to the camera, the system can also validate that the barcode of the picked item is the correct one.
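
To make that flow a bit more concrete, here is a minimal sketch in Python of the logic described above. All names and data are hypothetical illustrations (KNAPP's actual software is not public and certainly differs): a detected 2D marker maps to a pick instruction to display, and the camera-read barcode is used to confirm that the picked item is correct.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class PickTask:
    marker_id: int         # ID of the 2D marker on the shelf location
    expected_barcode: str  # barcode the picked item must match
    quantity: int          # number of items to pick at this location

def guidance_for(marker_id: int, tasks: Dict[int, PickTask]) -> Optional[str]:
    """If the detected marker belongs to the current order, return the
    guidance to overlay in the display (arrow plus quantity)."""
    task = tasks.get(marker_id)
    if task is None:
        return None
    return f"-> pick {task.quantity} item(s) here"

def validate_pick(marker_id: int, scanned_barcode: str,
                  tasks: Dict[int, PickTask]) -> bool:
    """Use the barcode read by the camera to check the picked item."""
    task = tasks.get(marker_id)
    return task is not None and scanned_barcode == task.expected_barcode

# Tiny simulated run: two pick locations, the picker scans one item.
tasks = {
    101: PickTask(marker_id=101, expected_barcode="4006381333931", quantity=2),
    205: PickTask(marker_id=205, expected_barcode="5449000000996", quantity=1),
}
print(guidance_for(101, tasks))                    # "-> pick 2 item(s) here"
print(validate_pick(101, "4006381333931", tasks))  # True: correct item
print(validate_pick(101, "0000000000000", tasks))  # False: wrong item picked
```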

Peter then demonstrated the product with a prototype, and presented the next steps for the technology.

The session was followed by a buffet during which the participants could try the glasses and test the system with markers and tagged items.

Two pictures will help you better understand the system.

KiSoft VISION glasses (prototype)

Note that it's still a prototype: in this version, the user sees the environment via screens inside the glasses. In the next version, the user will see through the glasses and the information will be overlaid on them (just like the head-up displays found in some cars).

Augmented Reality for picking

When "seeing" markers the system inserts virtual images to guide the picking (during the demonstration the image was displayed on the screen for everyone to see). The item the camera is showing is the small box with a marker on the left hand side of the table, an arrow hovers over it to indicate that this is the item to pick.

Many thanks to Birgit and Peter for the great presentation and their willingness to explain the details behind this great system!

When "seeing" markers the system inserts virtual images to guide the picking (during the demonstration the image was displayed on the screen for everyone to see). The item the camera is showing is the small box with a marker on the left hand side of the table, an arrow hovers over it to indicate this is the item to pick.