View Full Version : Neato Xv-11 Lidar

07-18-2011, 02:34 PM
I designed a board to interface with the Neato lidar that does the TTL-to-USB conversion and supplies the PWM signal to the motor to maintain its speed. I have a PIC on board that watches the data and varies the PWM based on the speed bytes. There are lots of jumpers on the board, so I can easily switch between monitoring the on-board PIC and monitoring the lidar directly. Anyway, my lidar has the 2.1 version firmware on it. I can capture the header and the speed bytes, and I combine the speed bytes into a 16-bit int. The weird thing is that the number goes down as the speed of the lidar goes up, so the value reported is not an RPM-type figure but something else. My biggest question is: what should the value be?
Going from this, I tried maintaining the speed at 55555, but that turned out to be impossible because it corresponds to a very slow speed: the lidar is barely turning at all. 53674 seems to be a more reasonable number (judging just by watching it spin). What I would like to know is what the ideal speed should be. Does anybody know what the numbers really represent and what the target should be? I should have snooped the data from my Neato before I disassembled everything.

Info on my board will be posted here.
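For anyone following along, here is a minimal sketch of the byte-combining step described above. The least-significant-byte-first order matches the reverse-engineered XV-11 packet notes, but treat the ordering as an assumption, not an official spec:

```python
# Hypothetical sketch: combine the two LDS "speed" bytes into a 16-bit value.
# LSB-first order is an assumption based on the reverse-engineered
# XV-11 packet notes, not an official Neato spec.

def speed_word(lsb: int, msb: int) -> int:
    """Combine the speed bytes, least-significant byte first."""
    return (msb << 8) | lsb

# Example: bytes 0xAA then 0xD1 give 0xD1AA = 53674
print(speed_word(0xAA, 0xD1))  # 53674
```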


07-18-2011, 04:22 PM
I get the feeling you may have the byte order swapped. It appears the older firmware covered by the ROS tutorial transmits/receives the least-significant byte first (the order from first to last byte received reads right to left in their tutorial). So, if you were sending it data...

LSByte First
'55555' is sent as 0x03 then 0xD9 which is then interpreted by the lidar as '55555'.
'53674' is sent as 0xAA then 0xD1 which is then interpreted by the lidar as '53674'.

MSByte First
'55555' is sent as 0xD9 then 0x03 which is then interpreted by the lidar as '985'.
'53674' is sent as 0xD1 then 0xAA which is then interpreted by the lidar as '43729'.
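The four cases above can be checked directly. 55555 is 0xD903 and 53674 is 0xD1AA, so reading the same byte pair in the wrong order produces the "swapped" values:

```python
# Illustration of the two byte orders discussed above:
# 55555 = 0xD903, 53674 = 0xD1AA.

def from_lsb_first(b0: int, b1: int) -> int:
    """Interpret b0 as the low byte, b1 as the high byte."""
    return b0 | (b1 << 8)

def from_msb_first(b0: int, b1: int) -> int:
    """Interpret b0 as the high byte, b1 as the low byte."""
    return (b0 << 8) | b1

print(from_lsb_first(0x03, 0xD9))  # 55555
print(from_lsb_first(0xD9, 0x03))  # 985  - same bytes, wrong order
print(from_msb_first(0xD1, 0xAA))  # 53674
print(from_lsb_first(0xD1, 0xAA))  # 43729 - same bytes, wrong order
```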

Might be useful, but probably not. Others will know far better than me.

07-18-2011, 06:32 PM
Hi Ringo,

Short answer: 53656

Long answer: I got this value from some of the initial Sparkfun scans (no_shield) that helped trigger the whole reverse-engineering spree. They were recorded with a logic analyzer directly on the complete robot, working normally, so I trust them.

You should be OK with that value, but in practice the LDS can run either slower or faster than that, as long as the speed is stable.

Also, you are correct that the "speed" value varies inversely with the speed of the LDS: here are the "speed" values I recorded in a test where I let it run freely, then slowed it down by putting my finger on it (firmware 2.1 too, obviously).
http://forums.trossenrobotics.com/gallery/files/4/7/9/2/lds_speed_when_slowed_down_-_fw2.1.jpg (http://forums.trossenrobotics.com/gallery/showimage.php?i=4046&original=1&c=3)
These values are very probably a measure of the time a full rotation takes, in a unit that made sense to them at the time. I'm pretty sure it is offset by a large value (maybe the D in the MSB is just an offset). In the test above, I slowed the LDS down by a factor of roughly 2.
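Given that the word rises as the rotation slows, the PWM feedback the PIC runs can be sketched as a simple proportional step: push the duty cycle up when the word is above the target and down when below. This is only an illustration; the target comes from the Sparkfun captures, and the gain and 0..1 duty range are hypothetical values, not measured constants:

```python
# Hedged sketch of proportional PWM control for the LDS motor.
# Assumption: the "speed" word grows as rotation slows, so a positive
# error means the motor needs more duty. TARGET is from the Sparkfun
# captures; GAIN and the 0..1 duty range are illustrative only.

TARGET = 53656   # "speed" word seen on a stock robot
GAIN = 0.0005    # hypothetical proportional gain

def update_pwm(duty: float, speed_word: int) -> float:
    """One proportional-control step; returns the new PWM duty (0..1)."""
    error = speed_word - TARGET      # positive -> spinning too slowly
    duty += GAIN * error             # raise duty to speed the motor up
    return min(1.0, max(0.0, duty))  # clamp to the valid duty range
```

In practice an integral term (or at least a slew limit) would help hold a stable speed, but the proportional step shows the sign convention.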

Very nice board by the way :)

07-18-2011, 09:20 PM
Cool, so I'm not nuts :-)
I'll set it to run at 53674 and maybe see if I can use the secondary debug port to tweak the value while it is running.