
Facebook now called Meta

Wow! Could I be cheeky- How is a project like that set up? What sort of coding language and architecture is used?
It was quite simple in this case, as we were reasonably lucky. This company's Dutch branch had traditionally been keen on innovation and was well positioned for the challenge: it already had a very modern warehouse management system (WMS) that used Android tablets as picker devices. And the requirement wasn't to develop a full-blown augmented reality system; it was all about leaving the pickers' hands free and speeding up the time it took to locate an item and verify it was correct.

The WMS was already configured to send item, location and quantity information to the Android devices (these were either attached to carts for the big stuff, or handheld for the majority of items), and this by itself would provide on-screen directions to the correct aisle, stack and row. All we had to do in the first instance was get these instructions to the glasses using the Google API.
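As a rough illustration of that first step, here is a minimal sketch of relaying a WMS pick instruction to a glasses display. The field names, the `PickInstruction` type and the payload format are all hypothetical; the source only says item, location and quantity were sent and mirrored to the glasses:

```python
from dataclasses import dataclass

@dataclass
class PickInstruction:
    """One line from a WMS pick list (fields assumed for illustration)."""
    item_id: str
    aisle: int
    stack: int
    row: int
    quantity: int

def to_glasses_payload(instr: PickInstruction) -> dict:
    # Reformat the on-screen direction the tablet already showed
    # into a short payload suitable for a heads-up display.
    return {
        "text": f"Pick {instr.quantity} x {instr.item_id}",
        "location": f"Aisle {instr.aisle}, stack {instr.stack}, row {instr.row}",
    }

payload = to_glasses_payload(
    PickInstruction("SKU-1234", aisle=7, stack=4, row=2, quantity=3)
)
print(payload["location"])  # "Aisle 7, stack 4, row 2"
```

The point is only that no new data source was needed for this stage: the same instruction the tablet rendered is reshaped and pushed to a second display.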

The next bit was the clever bit. We already used RF and UF as positional sensors for aisle and stack location (we expected the picker to count the vertical rows in the racks), but now we needed an optical feed (from the glasses) to provide the improved accuracy required to direct a picker to a 30cm x 30cm box in an aisle that was at minimum 30m long and 10m high. We used Oracle and ArcGIS software to convert the optical feed into geospatial coordinates. These were then fed, along with the existing WMS data, into a bespoke service we developed that would basically convert the RF location and optical spatial data to a point in space within the virtual warehouse (did I mention we had to scan the warehouse and model it?). This combined data, along with some picker biometrics (height), would enable the system to know where it was in the warehouse to within a few cm, and what direction it was facing (every 5m in each aisle we had a location code that could be scanned by the glasses or the picker device if they got lost).
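The fusion step described above can be sketched as combining a coarse RF cell with a fine optical offset. Everything concrete here is an assumption for illustration: the aisle/stack pitch, the eye-height factor, and the coordinate convention are invented, since the source only says the two signals were merged into a point in the scanned warehouse model:

```python
def fuse_position(rf_aisle, rf_stack, optical_offset_m, picker_height_m):
    """Combine a coarse RF cell (aisle, stack) with the fine optical
    offset from the glasses feed into one (x, y, z) point in the
    virtual warehouse model.
    Assumed layout: aisles spaced 3 m apart along x, stacks 5 m apart
    along y; both pitches are made-up numbers.
    """
    AISLE_PITCH_M = 3.0   # assumed distance between aisle centrelines
    STACK_PITCH_M = 5.0   # assumed distance between stack markers

    base_x = rf_aisle * AISLE_PITCH_M   # coarse position from RF/UF
    base_y = rf_stack * STACK_PITCH_M

    dx, dy, dz = optical_offset_m       # fine correction from the glasses
    # Picker height anchors the vertical axis: the camera sits roughly
    # at eye level (0.93 x body height is an assumed factor).
    eye_z = picker_height_m * 0.93
    return (base_x + dx, base_y + dy, eye_z + dz)

x, y, z = fuse_position(2, 4, optical_offset_m=(0.1, -0.2, 0.5),
                        picker_height_m=1.8)
```

The design point survives even with made-up constants: RF narrows the search to a cell, the optical feed refines within it, and the biometric height calibrates the vertical axis, which is how cm-level accuracy becomes plausible without cm-level RF.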

Then it was just a simple job of designing how this data was displayed to the picker via the glasses, and job done.

We did look at doing item recognition, but thought it was a bit OTT; it was easier to just read the barcode on the item.

Hope that helps. I think this system has since been sold to the WMS vendor, but I'm not sure.

That is extremely enlightening; thank you very much for taking the time out to explain. You've completely satisfied my curiosity. If only everything could be explained so well.

So you took the existing cataloguing, made a satnav* system and then used AR to highlight the exact position of an item in the warehouse via glasses or goggles?

And you self-effacingly use the words 'basically' and 'simply' when setting out the really clever bits.

*I presume ArcGIS is the satellite positioning? How stable/accurate is that? I thought it was sketchier than resolving to cm scale?
Cheers

I like your synopsis, wish we had used that when selling the idea to the directors :)

ArcGIS is a suite of tools that allows you to manipulate spatial data such as real-world imagery and remotely sensed data (represented within a data store as vector points), not just location data.
 
Sounds great, but I'm wondering: why couldn't you just give every aisle/shelf/row/bay/stack a code that is written on it, like a cinema seat, and then have an earpiece say '7-4-24-3-2' and they walk to that location?
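The cinema-seat scheme described above amounts to a trivial parse. A minimal sketch, assuming a made-up zone-aisle-bay-shelf-position field order (the source never defines what the five numbers mean):

```python
def parse_location_code(code: str) -> dict:
    """Split a spoken code like '7-4-24-3-2' into its parts.
    The field order (zone, aisle, bay, shelf, position) is an
    assumption for illustration only.
    """
    zone, aisle, bay, shelf, position = (int(p) for p in code.split("-"))
    return {"zone": zone, "aisle": aisle, "bay": bay,
            "shelf": shelf, "position": position}

loc = parse_location_code("7-4-24-3-2")
```

The simplicity is exactly the appeal of the question: no sensors, no model, just labels.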
 
It used to be done a bit like that years ago, but the picking process had to become more efficient, so now a picker will actually be picking multiple lists at once, and much faster than someone could read off location IDs. Even a 1-second improvement can equate to huge efficiency gains multiplied across hundreds of pickers over a year.
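The back-of-envelope maths behind that last claim is easy to check. All the volumes below are assumed numbers, not figures from the warehouse in question:

```python
# Rough illustration: even 1 second saved per pick adds up.
picks_per_picker_per_day = 1000   # assumed pick rate
pickers = 300                     # "hundreds of pickers" (assumed)
working_days = 250                # assumed working days per year
seconds_saved_per_pick = 1

hours_saved_per_year = (pickers * picks_per_picker_per_day
                        * working_days * seconds_saved_per_pick) / 3600
print(f"{hours_saved_per_year:,.0f} hours saved per year")  # ~20,833
```

At those (invented) volumes a single second per pick is worth on the order of twenty thousand labour hours a year, which is why shaving the read-the-label step matters.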


Sent from my iPad using Tapatalk
 