msadowski blog, by Mateusz Sadowski (https://msadowski.github.io/)
<p>Recently, I’ve been thinking about good practices in robotics software development and realized that one of the most impactful ones is a lesson I learned early in my career. The lesson is: <strong>Build Toolkit(s)</strong>.</p>
<!-- more -->
<h2 id="background">Background</h2>
<p>In 2014, I wrapped up my degree and moved to the UK to join SkyCircuits (now <a href="https://callenlenz.com/">Callen-Lenz</a>), a company that at the time was developing a high-end autopilot for drones. What set us apart was a modular control architecture, on-board script support, and an out-of-the-box setup for any platform (multirotors, fixed-wings, helicopters).</p>
<p>We built three user-facing Ground Control Station (GCS) tools:</p>
<ul>
<li>Plan - mission planning</li>
<li>Flight - mission execution</li>
<li>Toolkit - a power-user interface to the autopilot</li>
</ul>
<p>As you probably guessed from the title, the reason we’re here is the last item on the list.</p>
<figure class="center">
<img src="/images/toolkit/toolkit.png" alt="User interface of SkyCircuits Toolkit" />
<figcaption>SkyCircuits Toolkit</figcaption>
</figure>
<p>SkyCircuits Toolkit was a power-user interface that allowed plotting data in real time (super handy when tuning PIDs) and updating any variable or script on the autopilot. It was very useful for all the teams within the company and for any power users who needed it.</p>
<p>I didn’t realize it at the time, but this was a lesson that stuck with me. And ‘Toolkit’ is such a great name that I always tend to use it.</p>
<blockquote>
<p>“… and we will build a power-user interface exposing all our parameters, and we are going to call it Toolkit”</p>
</blockquote>
<p style="text-align:right">Me, on at least two occasions with two different clients</p>
<h2 id="create-software-for-power-users">Create software for power-users</h2>
<p>Many electromechanical systems require a unique set of parameters that you need to set for each device you build, no matter how cookie-cutter your approach is. The configuration you store might include PID controller terms, IMU offsets, camera calibration values, device names, and so on.</p>
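<p>To make this concrete, here is a hypothetical sketch of what such a per-device configuration record could look like. All names, parameters, and default values below are made up for illustration; they don’t come from any real product:</p>

```python
from dataclasses import dataclass, field

@dataclass
class DeviceConfig:
    """Hypothetical per-device configuration for a small robot."""
    device_name: str
    # PID terms for one control loop (illustrative defaults)
    pid_gains: dict = field(
        default_factory=lambda: {"kp": 1.0, "ki": 0.1, "kd": 0.05}
    )
    # Accelerometer biases in m/s^2, found during per-unit calibration
    imu_offsets: tuple = (0.0, 0.0, 0.0)

# Every unit that leaves the bench gets its own record like this:
config = DeviceConfig(device_name="ugv-007", imu_offsets=(0.01, -0.02, 0.005))
```

<p>The point is that every shipped unit carries its own copy of values like these, so someone has to set them, and someone has to be able to change them later.</p>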
<p>Having worked with over 50 companies, I’ve repeatedly seen software developers involved in provisioning new devices and tuning parameters by hand. This approach is OK when you are prototyping, but once you start scaling up, this becomes an issue.</p>
<p>To illustrate, let’s say the PID controller values of your autopilot are stored in a database, which is the only interface for changing these values. In such a situation, the following request would not be unheard of:</p>
<blockquote>
<p>Hey Mat, we changed the ESCs on our multirotor and now it’s a little bit wobbly in hover. Can you tune it?</p>
</blockquote>
<p>A simple request like this can cost you <strong>30 minutes</strong> at best (that includes the <a href="https://robinweser.com/blog/the-hidden-cost-of-context-switching">context switching cost</a>). At worst, it might require a field trip and take half a day or more. That’s time your developer isn’t contributing to the product.</p>
<p>By creating power-user interfaces, you can level the robot set-up playing field between your software, mechanical, and electrical teams. Asking a mechanical engineer to learn ROS 2, SQL, or the configuration-storage-solution-du-jour is a big ask. Abstracting these interactions into a piece of software that remains stable regardless of the underlying implementation will make all your teams equally productive when it comes to robot set-up.</p>
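<p>A minimal sketch of that abstraction might look like the following. The interface and all names here are my own invention (not SkyCircuits’ or any client’s design): the power-user tool is written against a stable parameter interface, while the storage backend behind it can change freely:</p>

```python
from abc import ABC, abstractmethod

class ParameterStore(ABC):
    """Stable interface the power-user Toolkit is built against."""

    @abstractmethod
    def get(self, name: str):
        """Return the current value of a named parameter."""

    @abstractmethod
    def set(self, name: str, value) -> None:
        """Update a named parameter on the device."""

class InMemoryStore(ParameterStore):
    """Toy backend; in practice this could wrap SQL, ROS 2 parameters,
    or a serial link to the autopilot, without the Toolkit changing."""

    def __init__(self):
        self._params = {}

    def get(self, name):
        return self._params[name]

    def set(self, name, value):
        self._params[name] = value

# A mechanical engineer tunes a gain without touching SQL or ROS:
store = InMemoryStore()
store.set("pitch_pid.kp", 0.8)
```

<p>Swapping <code>InMemoryStore</code> for a database-backed or ROS-backed implementation leaves every user of the tool unaffected, which is exactly the stability the non-software teams need.</p>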
<p>What’s the next step after building your Toolkit? Naturally, automating the entire provisioning process!</p>
<p><a href="https://msadowski.github.io/build-toolkits/">Build Toolkits</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on May 20, 2025.</p>
Robotics
<h2 id="intro">Intro</h2>
<p>If you’ve been following my blog for a while, you might have seen my earlier blog posts summarizing my experience as a Remote Robotics Consultant. In case you’d like to start from the beginning, here is the ordered list of my posts:</p>
<ul>
<li><a href="https://msadowski.github.io/one-year-of-robotics-consulting/">One Year of Working as a Robotics Consultant</a></li>
<li><a href="https://msadowski.github.io/2-years-remote-consulting/">Thoughts on 2 Years as a Remote Robotics Consultant</a></li>
<li><a href="https://msadowski.github.io/3-5-years-remote-consulting/">Remote Robotics Consulting - 3.5 Years In</a></li>
<li><a href="https://msadowski.github.io/5-years-remote-robotics-consulting/">Remote Robotics Consulting - A Five-Year Retrospective</a></li>
</ul>
<p>In hindsight, some of the choices I made turned out to be suboptimal, but even if I could turn back time, I’m not sure I would be able to make better decisions. Let’s see if there are some lessons to be learned!</p>
<!-- more -->
<p>The section below describes my last long-term contract; if you’d like to jump straight into some meaty advice, see the next section, <strong>Lessons Learned</strong>.</p>
<h2 id="update-on-my-previous-blog-post">Update on my Previous Blog Post</h2>
<p>In my <a href="https://msadowski.github.io/5-years-remote-robotics-consulting/">last consulting update</a>, I ended my blog post with a bit of a teaser about what was next: my increased involvement with <a href="https://www.dte.ai/">DTE</a>, an Icelandic company building laser-shooting robots for molten metal analysis. I considered the company’s product the pinnacle of tackling DDD (Dirty, Dull, Dangerous). Smelters, and metal processing in general, are no joke, and people get injured in these places very often.</p>
<figure class="center">
<img src="/images/5_yr_consulting/image5.jpg" alt="Red pill blue pill on hands" />
<figcaption>Do you take the leap? - My thinking at the time</figcaption>
</figure>
<p>Here was my thinking over two years ago: if I were given the choice to join a company I believe in, should I drop my consulting operation and go all in?</p>
<p>It turned out dropping consulting was not necessary. The alternative was going through an intermediary company. The math of becoming an employee of an umbrella company was quite interesting: I would double my hours, but because of all the taxes and fees involved, I would end up earning only 100 USD more a month.</p>
<p>Given this, we decided I would remain a consultant, with all the perks that come with that arrangement (no fixed schedule, freedom to take other clients), plus stock options and a three-month notice period.</p>
<p>One discussion I will always cherish from my time at DTE was when I was chatting with my manager at the time, Lúðvík, and the conversation went along the lines of:</p>
<ul>
<li>“Sure, I can lead the robotics team, but it might be difficult since I’m not working full-time” (I was at 80 hours per month back then)</li>
<li>“What? Based on your output, I thought you were full-time”</li>
</ul>
<figure class="center">
<img src="/images/7_yr_consulting/image1.png" alt="A cat staring at a pancake" />
<figcaption>This is me</figcaption>
</figure>
<p>I indeed felt like I had reached my peak productivity with that company.</p>
<p>As I publish this blog post, I have wrapped up my cooperation with DTE. It’s been an interesting and very intense period of my life, and as a result, I can tell people that I’ve worked on heavy metal robots that shot lasers.</p>
<h2 id="lessons-learned">Lessons Learned</h2>
<p>Of course, there was going to be a lessons learned section. Today, I have for you a mix of business and technical advice:</p>
<ul>
<li><strong>Retainers</strong> are amazing - it feels great to lock in the hours at the beginning of the month. If you go this direction, consider how to handle under/over time (charge for it at the beginning of the next month, shift hours around, take them on as holiday)</li>
<li>If you are entering a long-term contract, having a <strong>notice period</strong> in the contract might be a good idea for both parties (in my case, it gave me three months of runway to have a think about what’s next)</li>
<li>Think of your <strong>insurance</strong> - metal smelters are very dangerous, which meant that if I was to go to one on behalf of DTE, I would not be covered by their insurance (since I’m also external to DTE). Always consider these in any hazardous environment!</li>
<li><strong>Eliminate repetitive busy work</strong> as soon as possible - developers should develop, not perform mundane, repetitive tasks. If this is the case, automate the tasks or create tools to reduce the barrier for performing these tasks</li>
<li>If you don’t have time to <strong>create content</strong> for your main service, then what are you even doing? - this one stings a bit. When I went all-in with the long-term contract, I didn’t have time or energy for side explorations, and this blog has been dormant. In hindsight, I should’ve taken fewer hours and played more</li>
</ul>
<h2 id="upwork">Upwork</h2>
<p>In every single one of my updates I mentioned Upwork, so I will do it here too. It feels like Upwork ended when they <a href="https://boostlancer.net/blog/boostlancer-thrives-despite-upworks-removal-of-rss-feed-feature">disabled their RSS feeds</a>.</p>
<p>For the past three months, I’ve been browsing the jobs in the ROS, robotics, and drone categories, and the quality of jobs has gone downhill since the last time I was actively using the platform. Back in the day, you would see 1-3 quality projects a month; in the past three months… nothing. They are also <a href="https://support.upwork.com/hc/en-us/articles/39620058162963-Variable-Freelancer-Service-Fee">upping their fees</a> again to a variable 0-15% (want to bet it will be closer to 15%?).</p>
<p>Upwork is heavily pushing their AI tools for proposal writing, and I think AI will do the platform a big disservice. When I created a job on Upwork the other day, I had a couple of long-form proposals within a minute of posting it. Clearly, AI bots are hard at work on Upwork, and I don’t think this is helping.</p>
<h2 id="weekly-robotics">Weekly Robotics</h2>
<p>In June last year, I took a sabbatical from my newsletter <a href="https://www.weeklyrobotics.com/">Weekly Robotics</a>. At the time, I was focused on renovating our apartment and preparing to welcome my daughter into the world. I was also hoping to finish Baldur’s Gate 3 but haven’t had many chances to play since.</p>
<p>Today, the newsletter is back in full strength. I started self-hosting the e-mail infrastructure and automated the shit out of issue creation. At one point, I considered using AI to create feature descriptions but decided against it, since I’m quite tired of all the AI slop on the Internet.</p>
<p>My goal for the newsletter for 2025? Grow it to 20,000 subscribers across e-mail and LinkedIn!</p>
<h2 id="whats-next">What’s next</h2>
<p>I’m tempted to try out something new and offer a “Senior Robotics Internship” package that, as I foresee it, would work as follows:</p>
<ul>
<li>Fixed three-month cooperation</li>
<li>Working on a specific problem the company is facing</li>
<li>Throwing in Weekly Robotics sponsorship as a bonus</li>
<li>Can be paid in money, equity, or a mix thereof</li>
</ul>
<p>I think it would make for an interesting experiment. If you’d like to try it out, feel free to <a href="mailto:[email protected]">get in touch</a>!</p>
<p>Until then, we will keep playing and exploring coverage path planners.</p>
<figure class="center">
<img src="/images/7_yr_consulting/path_planning.jpg" alt="A baby checking out a Roomba" />
<figcaption>Learning everything there is on coverage path planning</figcaption>
</figure>
<p><a href="https://msadowski.github.io/7-years-remote-robotics-consulting/">Remote Robotics Consulting - Seven Years In</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on April 23, 2025.</p>
Robotics
<h2 id="intro">Intro</h2>
<p>This year, I paid my first visit to FOSDEM (Free and Open Source Software Developers’ European Meeting) and helped organize the Robotics and Simulation Devroom. This blog post summarizes my experience.</p>
<!-- more -->
<p>Last year, Kimberly McGuire approached me about organizing a robotics devroom at FOSDEM. One thing led to another, and we assembled a dream team of organizers for the “Robotics and Simulation Devroom”:</p>
<ul>
<li><a href="https://www.linkedin.com/in/arnaud-taffanel-a5750211">Arnaud Taffanel (Bitcraze)</a></li>
<li><a href="https://www.linkedin.com/in/fred-g-5388a311/">Fred Gurr (Eclipse Foundation)</a></li>
<li><a href="https://www.linkedin.com/in/knmcguire/">Kimberly McGuire (Independent)</a></li>
<li><a href="https://www.linkedin.com/in/lucaschiesa/">Lucas Chiesa (Ekumen)</a></li>
<li><a href="https://www.linkedin.com/in/mateuszsadowski/">Mat Sadowski (me)</a></li>
</ul>
<figure class="center">
<img src="/images/fosdem2025/organizers.jpg" alt="Devroom organizers" />
<figcaption>Organizers of the Robotics and Simulation Devroom</figcaption>
</figure>
<p>and just like that, I set off for a <a href="https://en.wikipedia.org/wiki/Club-Mate#Culture">Club Mate</a>-fueled endeavor where hackers meet open-source—or should I say, make open-source possible.</p>
<p>What sets FOSDEM apart from other conferences I attended in the last couple of years is the fact that the event is free to attend. I think it does not get more open-source than this.</p>
<h2 id="the-robotics-and-simulation-devroom">The Robotics and Simulation devroom</h2>
<figure class="center">
<img src="/images/fosdem2025/fosdem_header.jpg" alt="Header Image of our Devroom" />
<figcaption>Graphics for our room by <a href="https://www.michalkalina.com/">Michał Kalina</a></figcaption>
</figure>
<p>Our room hosted the first Robotics devroom in FOSDEM’s history. In the last section of this blog post, I’ll provide a list of robotics-related talks from previous FOSDEMs. We had a room for half a day and managed to squeeze in 14 talks in less than four hours, largely because many presenters agreed to turn their talks into lightning talks. The room was full most of the time, with people occasionally queuing outside to enter.</p>
<h2 id="the-talks-i-found-interesting">The talks I found interesting</h2>
<p>Below is a short list of talks that I found interesting and attended:</p>
<ul>
<li><a href="https://fosdem.org/2025/schedule/track/robotics/">The whole Robotics and Simulation track</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6024-satnogs-comms-an-open-source-communication-subsystem-for-cubesats/">SatNOGS-COMMS: An Open-Source Communication Subsystem for CubeSats</a> - stepping outside of my comfort zone and going more embedded. Some very interesting lessons learned about using Zephyr-RTOS</li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-5525-the-road-to-open-source-general-purpose-humanoids-with-dora-rs/">The road to open source General Purpose Humanoids with dora-rs</a> - very nice demo with Pollen Robotics Reachy robot and, hopefully, a step towards general robots</li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6299-exploring-open-source-dual-a-b-update-solutions-for-embedded-linux/">Exploring Open Source Dual A/B Update Solutions for Embedded Linux</a> - good overview on strategies for reasonable firmware update workflows and overview on what’s out there</li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-4146-discovering-the-magic-behind-opentelemetry-instrumentation/">Discovering the Magic Behind OpenTelemetry Instrumentation</a> - trying to expand my horizons on logging. Implementing this with ROS would make a really nice side project</li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6092-what-can-pyarrow-do-for-you-array-interchange-storage-compute-and-transport/">What can PyArrow do for you - Array interchange, storage, compute and transport</a> - I’ve heard of Arrow quite a bit lately and I’m excited to look more into its IPC capabilities and zero-copy approach. Did you know that with Arrow you can seamlessly switch between NumPy, Pandas, and other libraries with zero copy?</li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6548-programming-ros-2-with-rust/">Programming ROS 2 with Rust</a> - Julia made a nice presentation in the Rust room on using ROS 2 with Rust</li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-5088-lessons-from-rewriting-systems-software-in-rust/">Lessons from rewriting systems software in Rust</a> - Some interesting insights on rewriting software that I believe do not only apply to Rust. One point that drove it home was depending on other people’s code</li>
</ul>
<figure class="center">
<img src="/images/fosdem2025/reachy.jpg" alt="Reachy Robot by Pollen robotics" />
<figcaption>Reachy used for a demo in the dora-rs presentation</figcaption>
</figure>
<p>Here is a list of talks I didn’t get to see but that I’ll catch up on once the videos are available:</p>
<ul>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-5760-zephyr-rtos-roasting-party/">Zephyr RTOS Roasting Party</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6300-using-embedded-rust-to-build-an-unattended-battery-powered-device/">Using embedded Rust to build an unattended, battery-powered device</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6341-hugging-face-ecosystem-for-local-ai-ml/">Hugging Face ecosystem for Local AI/ ML</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-4801-apache-arrow-the-great-library-unifier/">Apache Arrow: The Great Library Unifier</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6660-from-supercomputer-to-raspberry-pi-building-open-source-polish-language-models/">From Supercomputer to Raspberry Pi: Building Open Source Polish Language Models</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-4257-lessons-from-10-years-of-certifying-open-source-hardware/">Lessons From 10 Years of Certifying Open Source Hardware</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-4668-augurs-a-time-series-toolkit-for-rust/">Augurs: a time series toolkit for Rust</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-5470-building-a-watt-meter-esp-rs-and-a-rocket-backend/">Building a watt-meter esp-rs and a rocket backend</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-5996-automating-low-level-firmware-validation-with-robot-framework/">Automating Low-Level Firmware Validation with Robot Framework</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-6384-infra-for-drones-lessons-learned-from-15-years-of-open-source-robotics-/">Infra for Drones: Lessons learned from 15 years of open source robotics</a></li>
<li><a href="https://fosdem.org/2025/schedule/event/fosdem-2025-5586-rtcp-racecars-video-and-5g/">RTCP, Racecars, video and 5g</a></li>
</ul>
<h2 id="tipstricks">Tips &amp; Tricks</h2>
<h3 id="high-quality-proposals-wanted">High Quality Proposals Wanted</h3>
<figure class="center">
<img src="/images/fosdem2025/robbie.jpg" alt="An image of a Robot Leaflet" />
<figcaption>Robbie developed by Kimberly to bravely promote our devroom during the event</figcaption>
</figure>
<p>This year, while organizing the Robotics and Simulation devroom, we received 24 proposals and had to make tough choices to fit them into half a day. We selected only the best and asked some speakers to condense their presentations into 5-minute lightning talks.</p>
<p>Some advice for next year’s applicants:</p>
<ul>
<li>Make a super high-quality proposal. You are competing for a speaking slot with world-class roboticists, a two-sentence abstract won’t cut it, no matter the reputation of your project</li>
<li>Don’t use AI to write your proposal (reviewers can often tell, and you get huge minus points for that)</li>
<li>It’s always a good idea to submit a link to your repository in the proposal. It’s great if it shows some history of development and it’s not just a handful of commits</li>
</ul>
<h3 id="accommodation">Accommodation</h3>
<p>I booked my accommodation within walking distance of the venue, thinking it made perfect sense to be as close to it as possible. It turns out the right questions to ask yourself are: Are you going to hang out with someone after the event? If so, where will it happen? Would you prefer to commute early in the morning or at night?</p>
<p>On Friday and Saturday, I spent the evenings near the city center of Brussels with my fellow co-organizers and had to commute back at night towards the campus. In hindsight, I would prefer to sleep closer to the center and get to campus early.</p>
<h3 id="making-it-to-see-the-talks">Making it to see the talks</h3>
<p>Often, the rooms are packed! If you are going to a talk that might be popular, get inside the room early. Once the rooms are full, you are not supposed to enter them for safety reasons.</p>
<h3 id="taking-notes">Taking notes</h3>
<p>When we organized the room, we asked the speakers to upload their slides to Pretalx. This meant that the slides showed up in the talk description, and as an attendee, you could download them and take notes right there on the slides. If only I had brought my tablet with me!</p>
<h2 id="list-of-robotics-adjacent-talks-throughout-fosdem-history">List of robotics-adjacent talks throughout FOSDEM history</h2>
<p>2016</p>
<ul>
<li><a href="https://archive.fosdem.org/2016/schedule/event/testing_robots_in_the_cloud/">Simulating Humanoid Robots in the Cloud: the testing behind the biggest world competition</a></li>
</ul>
<p>2017</p>
<ul>
<li><a href="https://archive.fosdem.org/2017/schedule/event/loco_positioning_crazyflie/">Loco Positioning: An OpenSource Local Positioning System for robotics</a></li>
</ul>
<p>2018</p>
<ul>
<li><a href="https://archive.fosdem.org/2018/schedule/event/autonomous_robot/">How to build autonomous robot for less than 2K€</a></li>
<li><a href="https://archive.fosdem.org/2018/schedule/event/rusty_robots/">Rusty robots; Programming a self-balancing robot in Rust</a></li>
</ul>
<p>2020</p>
<ul>
<li><a href="https://archive.fosdem.org/2020/schedule/event/ema_ros2_evolution/">ROS2: The evolution of Robot Operative System</a></li>
<li><a href="https://archive.fosdem.org/2020/schedule/event/iotnuttx/">Making an IoT robot With NuttX, IoT.js, WebThings and more</a></li>
<li><a href="https://archive.fosdem.org/2020/schedule/event/ema_iceoryx/">Introduction to Eclipse iceoryx; Writing a safe IPC framework for autonomous robots and cars</a></li>
</ul>
<p>2023</p>
<ul>
<li><a href="https://archive.fosdem.org/2023/schedule/event/fossbot/">FOSSbot: An open source and open design educational robot (Lightning Talk)</a></li>
</ul>
<p>2024</p>
<ul>
<li><a href="https://archive.fosdem.org/2024/schedule/event/fosdem-2024-2898-controlling-a-6-degree-robot-arm-using-a-48k-zx-spectrum/">Controlling a 6 degree Robot Arm using a 48K ZX Spectrum</a></li>
<li><a href="https://archive.fosdem.org/2024/schedule/event/fosdem-2024-3225-dora-rs-simplifying-robotics-stack-for-next-gen-robots/">Dora-rs: simplifying robotics stack for next gen robots</a></li>
</ul>
<p>These are all the thoughts I have on FOSDEM 2025. See you in Brussels next year!</p>
<p><a href="https://msadowski.github.io/Roboticist-visits-fosdem-2025/">A Roboticist Visits FOSDEM 2025</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on February 06, 2025.</p>
Robotics

<p>Remote Robotics Consulting - A Five-Year Retrospective, published by Mateusz Sadowski in August 2023.</p>
<h2 id="intro">Intro</h2>
<p>If you’ve been following my blog for a while, you might have seen my earlier blog posts summarizing my experience as a Remote Robotics Consultant. In case you’d like to start from the beginning, here is the ordered list:</p>
<ul>
<li><a href="https://msadowski.github.io/one-year-of-robotics-consulting/">One Year of Working as a Robotics Consultant</a></li>
<li><a href="https://msadowski.github.io/2-years-remote-consulting/">Thoughts on 2 Years as a Remote Robotics Consultant</a></li>
<li><a href="https://msadowski.github.io/3-5-years-remote-consulting/">Remote Robotics Consulting - 3.5 Years In</a></li>
</ul>
<p>In this blog post, I will highlight some of the updates since my last blog and offer some advice that I hope will be useful for anyone looking to get into technical consulting.</p>
<p>Strap in, and let’s go!</p>
<!-- more -->
<h2 id="update-on-my-previous-blog-post">Update on my Previous Blog Post</h2>
<p>Last time, I mentioned multiple things that are now out of date:</p>
<ul>
<li>I no longer rent a workshop. Focusing more and more on software projects, I practically stopped using it. The 30-minute one-way commute did not help either, so I gave it up at the beginning of this year.</li>
<li>Currently, approximately 2% of my projects come from Upwork. They are only short-term consultations. The rest are direct projects with clients.</li>
<li>I solved the context-switching problem - I haven’t had issues with it since my last post, so that’s great news. What helped here was turning off all notifications. I found my e-mail the most distracting and have since decided to check it at my own pace.</li>
<li>Work-life balance? - Here, I fell behind compared to the last update. On two occasions, I was verging on burnout. Things are better now, but I realized I don’t have hobbies outside of robotics. I’m actively working on this and starting to see the light at the end of the tunnel.</li>
<li>In my last update, I mentioned taking on all opportunities that fell into my lap. Well, not anymore. I’ve become more selective and more mindful of my schedule.</li>
</ul>
<figure class="center">
<img src="/images/3_yr_consulting/office_collage.png" alt="Collage of the office" />
<figcaption>Bye-bye workshop.</figcaption>
</figure>
<h3 id="passive-income---here-i-come">Passive Income - Here I Come!!</h3>
<p>Last year, I released a Manning Live Project series on <a href="https://www.manning.com/liveprojectseries/build-mobile-robots-with-ROS2?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski_build_5_6_22&a_aid=mateusz&a_bid=a308c7c4">Building Mobile Robots with ROS 2</a>. Since its release, around 230 students have enrolled. This project took me slightly over a man-month of work, spread over six months. Since I’m only now about to earn out my advance, earning a month’s consulting salary from this project will take approximately 19 years. Even though I’m not earning a huge amount from it, if I could go back in time, I would do it again. It’s been an incredible journey, and I think it’s time to look into publishing more content like this. Lately, I’m finding writing quite rewarding.</p>
<h2 id="weekly-robotics">Weekly Robotics</h2>
<p>Three months after I became a consultant, I started <a href="https://www.weeklyrobotics.com/">Weekly Robotics</a>, a newsletter that sums up the most exciting research, projects, and news I come across during my industry research. The newsletter is now closing in on 12,000 subscribers across two delivery channels.</p>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/5_yr_consulting/wr_features_2022.mp4" type="video/mp4" />
</video>
<figcaption>Header images of all Weekly Robotics issues in 2022.</figcaption>
</figure>
<p>2022 was a terrific year for the newsletter: we had many advertisers, started our first long-term partnership with AMD, and managed to budget about $10k for its growth and content.</p>
<p>I’ve put this money towards:</p>
<ul>
<li><a href="https://www.weeklyrobotics.com/weekly-robotics-250">Visiting ICRA this year</a></li>
<li>Getting technical folks to write long-form articles</li>
<li>Doing <a href="https://github.com/orgs/WeeklyRobotics/sponsoring">GitHub sponsorships</a> for some open-source robotics developers</li>
<li>E-mail distribution, hosting, etc.</li>
<li>Subcontracting sponsor outreach</li>
<li>Article editing</li>
</ul>
<p>The newsletter, however, is far from profitable, especially once I account for the consulting work I could do in that time. Last time I checked, I spend about one man-month on this project annually. It also does not bring in many projects (I’ve counted three since I started it). This is understandable, though, as I’m not advertising my consultancy there and prefer to keep it separate from my branding.</p>
<p>The best thing about the newsletter now is the community. Our <a href="https://www.patreon.com/WeeklyRobotics">Patreon</a> Slack channel has grown quite a bit, especially since I started inviting non-Patreon users. Having a small community to bounce ideas off of and discuss robotics and projects with is great!</p>
<h2 id="insights">Insights</h2>
<p>These blog posts are helpful to many (or so I’ve been told), so let’s jump into the meat of it. Technical consulting is not a zero-sum game, so I’m more than comfortable sharing what has worked for me on this journey. I hope some of you will find these points helpful.</p>
<h3 id="my-problem-solving-strategy">My Problem-Solving Strategy</h3>
<p>If you have clients who are not experts in robotics, as a consultant, you can have a lot of leverage over them. Let’s consider an exaggerated situation where a client wants to move an item from point A to B and knows nothing about automation. To drive the point home, let’s say that you either:</p>
<ol>
<li>Propose to develop a ground vehicle with a manipulator arm attached</li>
<li>Tell them the conveyor belt exists and to check them out</li>
</ol>
<p>The first approach would provide you with lots of work (and money), while the second would either mean that you don’t engage in the project or that you take on the role of an integrator. Maybe I’m a bit too idealistic, but I always use the latter approach and put my customer’s interest first. I think it’s much easier to work on projects you believe in, and you sleep better at night, too!</p>
<h3 id="two-rooms">Two Rooms</h3>
<p>There is a massive overlap between engineering consulting and consulting in the creative fields. In his book, <a href="https://www.expertise.is/">The Business of Expertise</a>, David Baker presents a concept of <a href="https://www.davidcbaker.com/your-creative-firm-is-a-single-building-with-two-rooms">two rooms</a> that I, of course, adopted and incorporated into my strategy.</p>
<figure class="center">
<img src="/images/5_yr_consulting/image1.png" alt="Strategy and execution rooms" />
<figcaption>The two-room concept. Source: <a href="https://www.davidcbaker.com/your-creative-firm-is-a-single-building-with-two-rooms">davidcbaker.com</a></figcaption>
</figure>
<p>In my work, I have two hats:</p>
<ul>
<li>Consultant - providing high-impact and high-value advice. Usually, this is quite high-level (strategy in the image above)</li>
<li>Contractor/Freelancer - very hands-on role, implementing hardware and software to solve customer problems (execution in the image above)</li>
</ul>
<p>If you start with a client providing strategy-related work, moving to execution is relatively easy and can be a part of your package. If you begin with execution, it might be tough to start wearing a strategy hat. You’d need a mix of luck, good salesmanship, and someone inside the organization recognizing your value.</p>
<h3 id="shut-up-and-listen">Shut Up and Listen</h3>
<p>When listening to a client, you might feel that you know how to fix their issue and that it is straightforward. Even though the urge to interrupt might be strong, you should not jump to conclusions. Chances are, your client has worked for years in their industry, and they know it way better than you ever will. So strap in, shut down your eagerness to propose solutions, and ask good questions instead.</p>
<blockquote>
<p>Client: … and the UGV we are working on needs to be localized to within 1 meter in the target environment.</p>
<p>You: Easy-peasy, we will take a LiDAR, put some SLAM library on this baby and it’s done.</p>
<p>Client: and the target environment for this client is a mirror factory.</p>
<p>You: …</p>
</blockquote>
<figure class="center">
<img src="/images/5_yr_consulting/image4.png" alt="Shocked Pikachu meme" />
<figcaption>You, after rudely interrupting a client with your solutions, instead of letting them finish describing their problem in detail…
</figcaption>
</figure>
<h3 id="dont-do-the-price-dance">Don’t Do the Price Dance</h3>
<p>Sometimes, you’ll get approached about a project. You will exchange a couple of e-mails, then you’ll take a 30-60 minute unpaid call, then you’ll create a summary of the meeting, and then, you’ll gently start hinting towards talking about money. If you do this, then I hope it works for you. Chances are, you’ve wasted everyone’s time (most importantly, yours).</p>
<p>What I tend to do these days is reply to the first message along the lines of:</p>
<blockquote>
<p>Hi!</p>
<p>Thanks for your message. What you describe sounds like a project I could help with. I charge my clients on an hourly basis at $X. If this works for you, I would love to have an introductory call and take it from there.</p>
</blockquote>
<p>Quite often, this will be the last interaction with a prospective client. Some people will also get offended if you discuss the price too early. ‘Godspeed’ is what we tell these folks and we move on.</p>
<p>The clients worth working with don’t question my hourly rate and respect my expertise, so it’s always worth being upfront about your fees.</p>
<h3 id="charge-by-the-project-if-you-want-to-be-rich">Charge by the Project if You Want to Be Rich.</h3>
<p>Jonathan Stark has <a href="https://medium.com/@jonathanstark/how-i-realized-that-hourly-billing-was-nuts-2aee1fa959b3">a blog post</a> and even a book series on how projects, where you bill by deliverables, are superior to hourly-based ones. Working like this is comfortable for your clients because they know how much they will pay upfront, and for you because you’ll incorporate a hefty premium in your estimate, and it will all work out, right?</p>
<p>Well, maybe. I find it e-n-o-r-m-o-u-s-l-y difficult to price R&D projects because it’s all fine and well until you suddenly need to recompile the kernel of the onboard computer, or it turns out that a fried electronic component caused the bug you’ve been tracking for three days, or the customer changes the requirements. Now you need to explain how this eats into the budget and figure out how to charge extra and get paid.</p>
<p>I think fixed-price projects work if you have a good idea of what you are doing. This means you have experience with most of the hardware the client wants, or you are in a position to choose for them, and you know roughly how to solve the software part, and ideally, you have delivered similar software/hardware in the past.</p>
<h3 id="remote-hardware-work-overhead">Remote Hardware Work Overhead</h3>
<p>Working remotely on hardware (e.g., integrating a LiDAR, IMU, or some SBC) takes 2-3x longer without physical access. Because of that, I usually advise clients to ship hardware to me or get me to their location. For purely remote work, it’s best when the client has a local team that can do the hands-on work and gather data for you, following your instructions.</p>
<h3 id="dont-withhold-expertise">Don’t Withhold Expertise</h3>
<p>Imagine a very theoretical scenario of a conversation between (Y)ou and two consultants (C1, C2) during an introductory call:</p>
<blockquote>
<p>Y: We need help with our localization stack. For some reason, our robot position estimate is not turning in the global frame when it’s rotating just fine in real life.</p>
<p>C1: Oh, that’s not ideal. I’ve set up the EKF on hundreds of platforms. If we work together, I’ll be able to fix this.</p>
<p>C2: This is an interesting problem. If I were you, I would check which axes are enabled in the EKF and ensure that the yaw axis of the IMU or the angular velocity from wheel odometry is fused. If they are enabled, I would double-check the covariances in your sensor messages.</p>
</blockquote>
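C2’s checklist can even be partially sketched in code. Below is a minimal, hypothetical sketch assuming robot_localization-style sensor configs, where each sensor gets a 15-element boolean vector over the state [x, y, z, roll, pitch, yaw, plus their velocities and accelerations]; the config values themselves are made up for illustration:

```python
# Indices into robot_localization's 15-element state vector:
# [x, y, z, roll, pitch, yaw,
#  vx, vy, vz, vroll, vpitch, vyaw,
#  ax, ay, az]
YAW, VYAW = 5, 11

def fuses_yaw(sensor_configs):
    """True if at least one sensor feeds yaw or yaw rate into the EKF."""
    return any(cfg[YAW] or cfg[VYAW] for cfg in sensor_configs.values())

# Hypothetical setup: the IMU fuses roll/pitch but not yaw, and wheel
# odometry fuses only linear velocities -- so the global heading never turns.
configs = {
    "imu0_config":  [False] * 3 + [True, True, False] + [False] * 9,
    "odom0_config": [False] * 6 + [True, True, False] + [False] * 6,
}
print(fuses_yaw(configs))  # False -> the bug C2 is describing
```

A real diagnosis would of course read the actual YAML parameters, but the point stands: the fix is often a single boolean.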
<p>Which consultant would you pick to work with? The second approach has been working well for me so far. I would not worry about the hypothetical scenario of a client taking your advice and running away; it will happen occasionally to C1 and C2 alike.</p>
<p>Even though it might be far from reality, C2 can appear more knowledgeable during the call because C1 didn’t showcase their skills. Having a portfolio can help C1, but by the end of the call, C2 has already contributed value to the company, which might make the client more keen to hire them.</p>
<h3 id="consider-regulations">Consider Regulations</h3>
<p>If you have this grand vision of developing the slickest delivery drone or a new robot for eye surgery, and your vision is very clear (pun intended!), and you want it to be on the market in half a year, I might have some bad news. Some fields are highly regulated, and for a good reason. Regarding drones, you should look at FAA certification in the U.S. or EASA in the EU.</p>
<p>My answer in these situations is usually: “Great, let’s do it. I can help you build the prototype, but let’s get onboard someone who knows the regulations so that we start building with these in mind”.</p>
<p>If you disregard the certification, you might create a prototype that works very well. In your tests, it could deliver the packages, but when you start looking into what is required - you might need to change your autopilot, triggering a rewrite of your software. By the way, we need a parachute now, and we won’t get as much flight time, so we need heavier batteries, but this will end up with us being in an even more controlled certification category… and it goes on.</p>
<p>Doing these rounds is undoubtedly fun, and you learn a lot, but, as a client, do you want to keep throwing money into the pit, instead of investing in a clear roadmap?</p>
<h3 id="be-careful-using-upwork-for-high-level-consultations">Be Careful Using Upwork for High-Level Consultations</h3>
<p>Upwork is quite a good place for finding projects. In exchange for a 10% fee, you get some discoverability and payment protection for hourly projects. The payment protection only works, though, if you move your mouse and press keyboard keys enough during your work. I had this issue with one of my clients: a very high-level consultation that did not require any input on my computer, and it just so happened that the customer’s credit card bounced.</p>
<p>Upwork then looked through my keyboard/mouse activity during the call and zeroed the periods without enough action (even though the app takes screenshots of your screen, and I could be seen on a call), removing a big chunk of my consultation time and never paying me for it. All of this is in line with their T&Cs. After the customer has their account revoked, you can manually add ‘lost time,’ but they can say no, and I don’t think there is much you can do about it, so use Upwork with caution.</p>
<h2 id="fin">Fin?</h2>
<p>Over the past two years, I’ve worked increasingly with <a href="https://dte.ai/">DTE</a>, to the point where I’m well-embedded with their team.</p>
<figure class="center">
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/sl-AOec0p9w" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen=""></iframe>
<figcaption>DTE promo video
</figcaption>
</figure>
<p>The company develops solutions (hardware and software) for analyzing the element composition of alloys. I always explain how the system works like this: it shoots lasers at an alloy and runs spectroscopy on the light emitted by the process to estimate the alloy’s element content.</p>
<figure class="center">
<img src="/images/5_yr_consulting/image2.jpg" alt="A man on a volcano" />
<figcaption>Me, scaling a volcano in Iceland during one of my many visits to DTE’s headquarters. </figcaption>
</figure>
<p>As a consultant, I’ve worked with about 50 companies now. There are only three that I would invest in, given a chance. DTE is one of those companies.</p>
<p>Even though I’m on a monthly retainer with one company at the moment, I still consult with others. But being so involved with a company day-to-day, you might be given a choice.</p>
<figure class="center">
<img src="/images/5_yr_consulting/image5.jpg" alt="Red pill blue pill on hands" />
<figcaption>Do you take the leap?</figcaption>
</figure>
<p><a href="https://msadowski.github.io/5-years-remote-robotics-consulting/">Remote Robotics Consulting - A Five-Year Retrospective</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on August 06, 2023.</p>
<p>Earlier this year, I made a beginner’s course <strong><a href="https://www.manning.com/liveprojectseries/build-mobile-robots-with-ROS2?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski_build_5_6_22&a_aid=mateusz&a_bid=a308c7c4">Build Mobile Robots with ROS2</a></strong>, that focuses on mobile robots. This blog post describes my experience and thought process when designing it.</p>
<!-- more -->
<h2 id="introduction">Introduction</h2>
<p>I used ROS for my Master’s Thesis (<em>A 3D Mapping Payload for UAV Applications</em>) in 2013. Then, I started working full-time with it around 2016 and never stopped. My main focus was always on applications using drones and mobile robots, so covering mobile robotics was a natural fit for the whole project series.</p>
<p>Why did I switch from calling it a course to calling it a project? Manning LiveProjects are not traditional online courses. Instead, the students get the materials they need to go through by themselves. Then they complete the assignments themselves, with occasional help from extra resources, mentors or fellow students.</p>
<h2 id="project-series">Project Series</h2>
<p>In the project series, you will be joining RoboPrototypes, a company focusing on developing custom robotics solutions for their customers. Once you complete your first project, you are on board with the company, and you have to deliver Dribot, a mobile robot prototype that will carry drinks in the customer’s food courts in the mall near you.</p>
<figure class="center">
<img src="/images/ros_course/dribot_solid.png" alt="A 3D render of a differential-drive robot" />
<figcaption>A simplified 3D render of a Dribot differential-drive robot</figcaption>
</figure>
<p>Your quirky boss Michael (any potential resemblance to characters from a certain TV show about people working in an office is purely coincidental), who at one point is careless enough to hit your co-worker’s shin <strong>really hard</strong> with a robot, will be closely monitoring your progress.</p>
<p>There are two highlights of my liveProject series that I think can make it interesting for students:</p>
<ol>
<li>The author’s insights build on my experience from the past four years of working as a consultant, dealing with a plethora of robotic platforms (more on this in the next section)</li>
<li>Mentorship - if any student gets stuck in a project, they can reach out on the chat to get help from me</li>
</ol>
<h3 id="series-structure">Series Structure</h3>
<p>I love the modularity that comes with ROS. Thanks to it, I was able to structure the course into the following sections that make logical sense:</p>
<ol>
<li><strong><a href="https://www.manning.com/liveproject/get-started?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski1_build_5_6_22&a_aid=mateusz&a_bid=2f7b11df">ROS2 - Getting Started</a></strong> - where we follow the <a href="https://docs.ros.org/en/galactic/Tutorials.html">official tutorials</a> and create some simple launchfiles and nodes</li>
<li><strong><a href="https://www.manning.com/liveproject/simulate-a-robot?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski2_build_5_6_22&a_aid=mateusz&a_bid=209e1b6a">ROS2 - Simulate a Robot</a></strong> - where we create a robot description and simulate a differential-drive robot in Gazebo Classic</li>
<li><strong><a href="https://www.manning.com/liveproject/sensors-and-sensor-fusion?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski3_build_5_6_22&a_aid=mateusz&a_bid=5ede56ff">ROS2 - Sensors and Sensor Fusion</a></strong> - where we add a Camera, LiDAR, and IMU and implement <a href="https://index.ros.org/p/robot_localization/#galactic">robot_localization</a> for sensor fusion</li>
<li><strong><a href="https://www.manning.com/liveproject/simultaneous-localization-and-mapping?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski4_build_5_6_22&a_aid=mateusz&a_bid=877bbdf0">ROS2 - Simultaneous Localization and Mapping</a></strong> - where we use <a href="https://github.com/SteveMacenski/slam_toolbox">Slam Toolbox</a> for, well, SLAM</li>
<li><strong><a href="https://www.manning.com/liveproject/navigation?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski5_build_5_6_22&a_aid=mateusz&a_bid=d038e3ab">ROS2 - Navigation</a></strong> - where we use <a href="https://navigation.ros.org/">Nav2</a> for navigation in the <a href="https://github.com/aws-robotics/aws-robomaker-small-house-world">AWS RoboMaker Small House World</a></li>
</ol>
<p>With the above projects, each consisting of three to five milestones, the work should be divided into manageable tasks. My idea at the time was that each of the milestones should take at most one evening to implement, and I hope I achieved this goal.</p>
<h3 id="resources">Resources</h3>
<blockquote>
<p>A project is as good as the resources contained within</p>
</blockquote>
<p style="text-align:right">Me, right about now</p>
<p>Since the idea behind the LiveProjects is to provide resources to students so that they can complete the project themselves, it’s essential to find quality resources. Fortunately, right before I started working on the series, Andreas Bihlmaier began working on his book <a href="https://www.manning.com/books/robotics-for-software-engineers?utm_source=mateusz&utm_medium=affiliate&utm_campaign=book_bihlmaier_robotics_1_28_22&a_aid=mateusz&a_bid=61d75361">Robotics for Software Engineers</a>. Andreas does a fantastic job explaining robotics concepts, I mean, take a look at the image below:</p>
<figure class="center">
<img src="/images/ros_course/robot_systems.png" alt="Graphics showing common parts between all types of robots" />
<figcaption>Sense, Plan, Act in various robotic systems. Source: <a href="https://www.manning.com/books/robotics-for-software-engineers?utm_source=mateusz&utm_medium=affiliate&utm_campaign=book_bihlmaier_robotics_1_28_22&a_aid=mateusz&a_bid=61d75361">Robotics for Software Engineers</a>, Chapter 1 by Andreas Bihlmaier</figcaption>
</figure>
<p>Since both of our projects are with Manning, I can link to Robotics for Software Engineers throughout my liveProject series. Additionally, students get a couple of months of access to the book when they purchase a liveProject.</p>
<p>On rare occasions, I could not find materials that would explain some concepts easily enough. In such cases, I wrote the introductions myself or created a <a href="https://msadowski.github.io/basic-sensors-for-mobile-robots/">dedicated blog post</a> in one case.</p>
<h2 id="author-insights">Author Insights</h2>
<p>As I mentioned earlier, the author’s insights are my favourite bits since they allow me to share some experiences from the field. Here is a sample from one of the milestones in the <a href="https://www.manning.com/liveproject/sensors-and-sensor-fusion?utm_source=mateusz&utm_medium=affiliate&utm_campaign=liveproject_sadowski3_build_5_6_22&a_aid=mateusz&a_bid=5ede56ff">Sensors and Sensor Fusion</a> liveProject:</p>
<div class="notice">
<p>
Almost every client I have had that was working with mobile robots had some issue with an IMU. These insights will be quite long, but building intuition about IMUs will help you build quality robots. I will have some insight to offer in the next milestone as well.
</p>
<h4>Transformation frames</h4>
<p>If you’ve gone through the previous project (Simulate a Robot), you will remember we discussed coordinate frames. You might remember that for a typical robot platform, you would expect to have an x-axis pointing forward, y-axis pointing left, and z-axis pointing up.</p>
<p>The thing with sensors is that they will always report data in their local coordinate frame, so you might find an IMU that, in its default position, has its z-axis pointing down. (Let’s say that the x-axis still points forward; this means that the y-axis points to the right.)</p>
<p>The first thing we need to take care of is making sure that our <code class="highlighter-rouge">imu_link</code> has the same orientation as the IMU coordinate frame (in this case, rolled 180 degrees around our robot’s x-axis).</p>
<p>This situation gets tricky when you start analyzing raw data. Since your IMU data is referenced in the IMU frame, if your robot rotates counter-clockwise, you would expect a positive angular velocity around the z-axis, but your IMU will report a negative one! This is totally fine as long as the imu_link correctly describes the orientation of the IMU coordinate system.</p>
<p>Any system that takes in this velocity will need to look up the coordinate transform between the <code class="highlighter-rouge">base_link</code> and the <code class="highlighter-rouge">imu_link</code>.</p>
<h4>Vibration</h4>
<p>A good rule of thumb is to isolate the vibrations physically. Ideally, you should put a rubber pad or thick double-sided tape between the IMU and your frame to dampen the vibrations.</p>
<p>Even though you could filter the IMU data in software, I would recommend always trying to solve the problem in the physical domain first whenever dealing with sensors or actuators, and only then looking into addressing the issue in software.</p>
<p>More insights on IMUs are coming soon in the following milestones!</p>
</div>
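The sign flip described in the insight above is easy to verify numerically: rotating the robot’s body-frame angular velocity by 180 degrees about the x-axis negates its z component. A minimal sketch in plain Python (no ROS required; the 0.5 rad/s value is made up for illustration):

```python
import math

def rotate_x(vec, angle_rad):
    """Rotate a 3-vector about the x-axis by angle_rad."""
    x, y, z = vec
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (x, y * c - z * s, y * s + z * c)

# Robot turning counter-clockwise: +0.5 rad/s about base_link's z-axis.
body_rate = (0.0, 0.0, 0.5)
# The IMU is mounted rolled 180 degrees about x (its z-axis points down),
# so the same motion shows up with the opposite sign in the IMU frame.
imu_rate = rotate_x(body_rate, math.pi)
print(imu_rate[2])  # -0.5
```

In a real system, tf2 performs exactly this kind of transform for you, driven by the `imu_link` orientation in your robot description.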
<h2 id="statistics">Statistics</h2>
<p>Here are some statistics on the project series that you might find interesting:</p>
<ul>
<li>Months from project start to release: <strong>7</strong></li>
<li>Working hours to deliver liveProject: <strong>177.5 hours</strong></li>
<li>Working hours spent on project mentorship so far: <strong>9 hours</strong></li>
<li>Total amount of text for the whole project series: <strong>34,019 words</strong> OR <strong>234,120 characters</strong> OR <strong>98 A4 pages</strong> (font size 10)</li>
<li>Total number of commits in my project repository: <strong>146</strong></li>
<li>Number of certificates granted so far: <strong>2</strong></li>
<li>Number of students that signed for the free project so far: <strong>258</strong></li>
<li>Number of students that signed up for the follow-up projects: <strong>97</strong></li>
</ul>
<h2 id="the-difficulties">The Difficulties</h2>
<p>One thing I did not figure out in this project series, and I would be super grateful if anyone has any ideas for a clean implementation, is handling a consistent code base across multiple projects. We build on the same code base from the second project until the end. Each milestone adds a bit of new code or configuration, and when I started, I copied the source code between milestones, appropriately providing partial or full solutions for each of them.</p>
<p>The above means that if there is a bug in the first milestone of the second project, I would need to manually copy the fix over to all the other milestones (I did it twice already). Perhaps a cleaner solution would be to use a branch per project milestone, but this way, I would need to apply any fixes to up to 16 branches, not really reducing the complexity of this operation.</p>
<p>If you have any thoughts on this, please let me know. If you would like to create a project like this, I suggest you have a good think about this issue before you start.</p>
<h2 id="outro">Outro</h2>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/ros_course/dribot_nav_through.mp4" type="video/mp4" />
</video>
<figcaption>Dribot navigating through waypoints in a simulated apartment</figcaption>
</figure>
<p>This project was quite a ride. The main challenge for me was consulting close to full-time and then also working on this series. With the help of Manning Staff, outstanding Reviewers and Implementers, I delivered what I think is a solid ROS2 project series for beginners who would like to learn how to use ROS2 for mobile robots.</p>
<p>Updating the whole series to work with Gazebo Fortress and ROS2 Humble is still on my plate. Hopefully, I will be able to deliver this summer.</p>
<p>That is all that I have for this update. Happy roboting!</p>
<p><a href="https://msadowski.github.io/ros2-mobile-robots-course/">I Created a Project-based ROS2 Course</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on July 05, 2022.</p>
<p>While preparing my ROS2 course on mobile robots, I was not able to find quality information about basic sensors for mobile robots and the most important considerations when using them. In this short blog post, I’ll discuss some of the issues I have come across while using IMUs and LiDARs in tens of robotics projects.</p>
<!-- more -->
<h2 id="imus">IMUs</h2>
<p>If you are working in mobile robotics, it’s pretty much a given that you will use an IMU on your robot. Over the years I’ve worked with multiple sensors such as:</p>
<ul>
<li><a href="https://www.sparkfun.com/products/11028">MPU6050</a></li>
<li><a href="https://www.bosch-sensortec.com/products/smart-sensors/bno055/">Bosch BNO055</a></li>
<li><a href="https://www.phidgets.com/?tier=3&catid=10&pcid=8&prodid=1204">PhidgetSpatial Precision</a></li>
<li><a href="https://www.xsens.com/products/mti-600-series">Xsens MTi-630-DK</a> - last used for my blog post on <a href="https://msadowski.github.io/3d-mapping-with-ros/">3D Mapping</a></li>
</ul>
<p>When choosing a sensor, your number one consideration will most likely be price; if you are in the market for some sensors, the IMUs above will introduce you to the whole range of prices. Please don’t consider the list above as advice, as everything depends on the fine details of your project. Most likely, you will want to take a look at features such as:</p>
<ul>
<li>Update rate - for wheeled mobile robots, ~50 Hz is enough; for UAVs, 100-500 Hz should do, depending on aircraft type</li>
<li>Whether the attitude fusion happens onboard the sensor</li>
<li>Available interfaces - USB is probably the easiest if you are thinking of using ROS</li>
<li>Resolution, maximum readings, gyro drift etc.</li>
<li>Calibration requirements - more on this in the next section</li>
</ul>
<h3 id="imu-gotchas">IMU gotchas</h3>
<p>In my experience so far, IMUs are <strong>the sensors</strong> causing the most trouble when integrating them on mobile robots. Let’s go through some common pitfalls one by one:</p>
<p><strong>1. Magnetic interferences</strong></p>
<p>In my first job, we were integrating a drone autopilot with an onboard IMU on a helicopter platform. After many days of trial and error, we were not able to tune a heading controller that would make the helicopter keep a fixed heading while in the air. Only by accident did we notice lots of noise in the heading component of the IMU data, and then, while landing the aircraft, we noticed that these magnetometer errors actually had a period tied to the rotation speed of the main rotor! We had not seen this behaviour on the smaller model, which led us to the discovery that this model used ferrous metal rods in the blades! That would explain the noise.</p>
<figure class="center">
<img src="/images/applied_sensors/RH80E10XW_F_01.jpg" alt="Image of TREX 800E helicopter" />
<figcaption>Rough dimensions of the helicopter in question (not the exact model). Source: <a href="https://www.align.com.tw/index.php/helicopter-en/trex800/">align.com.tw</a></figcaption>
</figure>
<p><strong>HINT:</strong> If you are using a magnetometer in your robot - always move it as far away as possible from sources of magnetic fields. The most common sources are moving ferrous metals, actuators (magnets!), and high-current wires.</p>
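If you suspect this kind of periodic interference, a quick frequency analysis of the logged heading data can confirm it. A toy sketch, assuming numpy is available (the 20 Hz rotor rate and the synthetic signal are made up for illustration):

```python
import numpy as np

def dominant_frequency(signal, sample_rate_hz):
    """Strongest non-DC frequency component of a 1-D signal, in Hz."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    return float(freqs[np.argmax(spectrum)])

# Synthetic heading trace: slow drift plus interference at a made-up
# 20 Hz main-rotor rate, sampled at 200 Hz for 5 seconds.
fs = 200.0
t = np.arange(0, 5, 1 / fs)
heading = 0.1 * t + 0.5 * np.sin(2 * np.pi * 20.0 * t)
print(dominant_frequency(heading, fs))  # 20.0 -> compare with rotor RPM / 60
```

If the peak lines up with something rotating on your platform, you’ve found your culprit.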
<p><strong>2. Vibration</strong></p>
<p>Your robot most likely vibrates. The amplitude of the vibrations will depend on the materials used, the type of terrain, and the drive train (among many other factors). In drones, we had a rule that if you plot all acceleration axes and the X- or Y-axis measurements touch the Z-axis measurements during hover, then your vibration levels are too high.</p>
<figure class="center">
<img src="/images/applied_sensors/vibrations_s500_accel.8df61160.png" alt="Acceleration plot showing excessive vibration levels" />
<figcaption>Example of vibrations that are definitely too high. Source: <a href="https://docs.px4.io/master/en/log/flight_review.html">docs.px4.io</a></figcaption>
</figure>
<p>Transferring these rules onto mobile robotics:</p>
<p><strong>HINT:</strong> If your robot drives on a flat planar surface and you see acceleration readings overlap between X/Y and Z axes then you definitely need to dampen the vibrations.</p>
<p>To dampen these kinds of vibrations you could use rubber offsets or thick double-sided tape.</p>
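The overlap rule of thumb above can be turned into a quick automated check. A minimal sketch with made-up sample values; a real check would use whole logged traces and account for signs:

```python
def vibration_too_high(ax_samples, ay_samples, az_samples):
    """Rule of thumb: on a flat, planar surface the x/y acceleration
    bands should stay clear of the gravity-dominated z band."""
    lateral_peak = max(max(ax_samples), max(ay_samples))
    return lateral_peak >= min(az_samples)

# Toy accelerometer samples in m/s^2: z sits around gravity (~9.8).
quiet = vibration_too_high([0.2, -0.1, 0.3], [0.1, 0.0, -0.2], [9.6, 9.8, 9.9])
noisy = vibration_too_high([4.0, 6.5, -3.0], [3.0, -5.0, 2.0], [6.0, 12.0, 9.8])
print(quiet, noisy)  # False True
```

If such a check fires on your logs, reach for the rubber pads before reaching for a software filter.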
<p><strong>3. Axes definition</strong></p>
<p>This point is highly specific to software, but many robotic systems I’ve been working on had issues with wrongly defined IMU axes at one point or another.</p>
<p><strong>HINT:</strong> Always double-check the IMU axes defined in your software vs. the axes of your physical device.</p>
<p><strong>4. Calibration</strong></p>
<p>Some IMUs will require you to perform calibration (usually by rotating the sensor in some way along all axes); however, some IMUs will not store this information (looking at you, <a href="https://learn.adafruit.com/adafruit-bno055-absolute-orientation-sensor/device-calibration">BNO055</a>). Once your IMU is mounted on a target system, you might not be able to rotate it along all axes, so you need to make sure this will not cause you any issues.</p>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/applied_sensors/roll.mp4" type="video/mp4" />
</video>
<figcaption>3D IMU calibration after you've fitted it on the car. Source: <a href="https://www.youtube.com/watch?v=iVRmFQixqsc">YouTube (Top Gear)</a>
</figcaption>
</figure>
<h2 id="lidars">LiDARs</h2>
<p>If you are reading this article from top to bottom (why wouldn’t you?), then here starts the fun! With LiDARs, we are entering a space of ‘smart robotics’, where our robots can start reasoning about their environments and can know where they are.</p>
<p>Without going into too much detail (<a href="https://en.wikipedia.org/wiki/Lidar">Wikipedia has got you covered</a>), with LiDARs a light pulse is emitted, and we calculate the time it takes for it to come back (in some cases, we might look at the phase shift of the light too).</p>
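As a quick worked example of the time-of-flight principle (a sketch for intuition; real sensors do this in firmware, and the 67 ns figure is just illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s):
    """The pulse travels to the target and back, so halve the path."""
    return C * round_trip_s / 2.0

# A target ~10 m away returns the pulse after roughly 67 nanoseconds.
print(round(tof_distance_m(66.7e-9), 2))  # 10.0
```

The nanosecond scale here is why LiDAR timing electronics are non-trivial.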
<p>When we talk about LiDARs on the market we will find the following types:</p>
<ul>
<li>Single point distance sensors (e.g. <a href="https://www.terabee.com/shop/lidar-tof-range-finders/teraranger-evo-15m/">Terabee TeraRanger Evo</a>, <a href="https://lightwarelidar.com/collections/lidar-rangefinders/products/sf11-c-100-m">Lightware SF11/C</a>) - provide single point measurements</li>
<li>2-D scanners (e.g. <a href="https://www.slamtec.com/en/Lidar/A1">RPLidar A1</a>, <a href="https://www.hokuyo-aut.jp/search/single.php?serial=169">Hokuyo UTM-30LX</a>) - provide single plane measurements</li>
<li>3-D scanners (e.g. <a href="https://ouster.com/products/scanning-lidar/os0-sensor/">Ouster</a>, <a href="https://velodynelidar.com/products/puck-lite/">Velodyne Puck</a>) - provide measurements in multiple planes</li>
<li>Scanners with non-repeating patterns (e.g. <a href="https://www.livoxtech.com/mid-40-and-mid-100">Livox Mid-40</a>) - provide measurements as a non-repeating pattern</li>
</ul>
<p>The choice of the sensor should depend on your application, interfaces you want to support, the processing power available etc.</p>
<h3 id="lidar-gotchas">LiDAR gotchas</h3>
<p>Before I worked with LiDARs I didn’t know that light can be so annoying! Here are some things you might (but hopefully won’t) come across.</p>
<p><strong>1. Eye fatigue</strong></p>
<p>Now, I don’t have any evidence for this, but on numerous occasions, I have experienced eye fatigue while working with LiDARs. It feels like your eyes are a bit numb; my colleague described the feeling as ‘pain, but without the pain component’. Technically, most LiDARs you will come across will be eye-safe, but I would still recommend not keeping them at your eye level. If anyone knows of any research about long-term issues that could result from this, please feel free to let me know.</p>
<figure class="center">
<img src="/images/applied_sensors/bart.png" alt="Blind Bart" />
<figcaption>Me, the first time setting up the robot for half a day and keeping the LiDAR at my eye level</figcaption>
</figure>
<p><strong>2. ‘Flip effect’</strong></p>
<p>This is one of the issues that you will rarely come across, but if you do, it will be a huge pain in the ass to rectify. I’m actually not aware of anything you can do to reliably filter this kind of data in all cases. What happens here is that the sensor reports a much smaller distance than it should, as shown in the picture below:</p>
<figure class="center">
<img src="/images/applied_sensors/flip_effect.png" alt="Sensor ranges when flip effect occurs" />
<figcaption>The 'flip effect' illustrated</figcaption>
</figure>
<p>The real distance to the target is red + green (the expected distance), but the sensor reports just the green distance. The only way to fix this problem that I know of is to update the sensor firmware.</p>
<p><strong>3. Field of view shenanigans</strong></p>
<p>Your LiDAR emitter will have a certain field of view (most likely a circular one, about 1-3 degrees). Looking at the picture below, there is a sensor with a field of view, and an object that is only partially in view. What distance would you expect the sensor to provide?</p>
<figure class="center">
<img src="/images/applied_sensors/sensor_fov_averaging.png" alt="Sensor field of view with a wall and an object in between the wall and the sensor" />
<figcaption>Sensor and its field of view</figcaption>
</figure>
<p>The answer is: it depends on your sensor! Some will provide the first return (Y), some will provide both Y and X (that’s how you can <a href="https://www.researchgate.net/figure/Example-LiDAR-point-clouds-of-a-canopy-tree-located-in-Permanent-Sample-Plot-PSP-14_fig3_341735422">map tree canopies with LiDAR</a>) and some will average all distances within the field of view, providing a result between the two (Y &lt; result &lt; X).</p>
<p>Whether any of these will be a problem for you depends entirely on your application.</p>
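<p>To make the difference concrete, the three behaviours can be sketched as follows (the mode names and the beam-fraction weighting are illustrative assumptions, not any vendor’s spec):</p>

```python
# Sketch: how different LiDARs might report a beam that partially hits a
# near object and partially the wall behind it. Each hit is a
# (distance, fraction_of_beam) pair within the beam footprint.

def reported_distance(returns, mode):
    if mode == "first":                      # nearest surface wins
        return min(d for d, _ in returns)
    if mode == "strongest":                  # largest share of the beam wins
        return max(returns, key=lambda r: r[1])[0]
    if mode == "average":                    # energy-weighted mean of all hits
        total = sum(w for _, w in returns)
        return sum(d * w for d, w in returns) / total
    raise ValueError(mode)

hits = [(2.0, 0.3), (10.0, 0.7)]  # 30% of the beam on a post, 70% on the wall
print(reported_distance(hits, "first"))     # 2.0
print(reported_distance(hits, "average"))   # 7.6
```

The "average" mode is the one that produces the misleading intermediate values, since 7.6 m corresponds to no physical surface at all.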
<p><strong>4. Maximum distance vs. environment</strong></p>
<p>LiDAR datasheets will usually specify the maximum distance, but you will not necessarily see the same maximum distance in your application. Usually, the specs provided are for targets with some defined reflectivity. If you point your sensor at a black target (low reflectivity), you probably won’t be able to reach the maximum distance quoted in the datasheet.</p>
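<p>As a rough rule of thumb: for a diffuse (Lambertian) target, the returned power scales with reflectivity over range squared, so the detectable range scales with the square root of the reflectivity ratio. A sketch with made-up numbers:</p>

```python
import math

# Back-of-the-envelope range derating for a lower-reflectivity target,
# assuming a Lambertian surface and received power ~ reflectivity / range^2.
# All numbers below are illustrative, not from any datasheet.

def derated_range(spec_range_m, spec_reflectivity, target_reflectivity):
    return spec_range_m * math.sqrt(target_reflectivity / spec_reflectivity)

# Datasheet says 90 m at 80% reflectivity; a black target may be ~10%:
print(round(derated_range(90.0, 0.80, 0.10), 1))  # 31.8
```

Real sensors deviate from this (optics, detector thresholds, specular returns), but it gives a feel for how much margin a "90 m" spec can lose on dark targets.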
<p>Another thing that will hamper your measurements is outdoor conditions. The chart below shows which wavelengths are strongly present in solar radiation.</p>
<figure class="center">
<img src="/images/applied_sensors/Solar_Spectrum.png" alt="Solar radiation spectrum chart" />
<figcaption>Solar radiation spectrum. Source: <a href="https://commons.wikimedia.org/wiki/File:Solar_Spectrum.png">Wikimedia Commons</a> </figcaption>
</figure>
<p>Most likely, the sensor of your choice will fall into one of the valleys in the chart. If the manufacturer of your sensor is not providing its operating wavelength, consider it a red flag. Combining everything we’ve covered in these subsections, you might learn that a sensor operating in full sun, looking at tall grass, might yield a maximum range that is suboptimal for your application.</p>
<h2 id="outro">Outro</h2>
<p>That about sums up the pitfalls I’ve run into when working with LiDARs and IMUs. Robotics being robotics, there are hundreds of other things that can go wrong in your setup; hopefully, with some patience and good detective work, you’ll be able to get to the bottom of the issues you come across in your setups.</p>
<p>Now, let’s make some robots!</p>
<figure class="center">
<img src="/images/applied_sensors/turtle_wr.jpeg" alt="Turtlebot wearing T-shirt 'I make robots'" />
<figcaption>Turtlebot wearing a <a href="https://shop.weeklyrobotics.com/">Weekly Robotics T-shirt</a></figcaption>
</figure>
<p><a href="https://msadowski.github.io/basic-sensors-for-mobile-robots/">IMUs and LiDARs - Not Uncommon Pitfalls</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on February 24, 2022.</p>
Robotics https://msadowski.github.io/3-5-years-remote-consulting2021-10-31T00:00:00-00:002021-10-31T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>This is the third blog post in the series of posts summing up my experience working as a Robotics Consultant. My recent realisation is that I’ve been consulting for half my career now, so hopefully some of the things I’ve learnt along the way will help someone else out there.</p>
<!-- more -->
<h2 id="previous-blog-posts">Previous Blog Posts</h2>
<p>If this update is your entry point to my reflections on working as a remote Robotics Consultant, then you might want to start with my previous blog posts: <a href="https://msadowski.github.io/one-year-of-robotics-consulting/">1 Year In</a>, <a href="https://msadowski.github.io/2-years-remote-consulting/">2 Years In</a>. I’ll do my best not to talk about the same things, but instead focus on the things that have either changed or that I’ve developed new thoughts on.</p>
<h2 id="high-level-business-update">High Level Business Update</h2>
<p>Let’s start with the client maps, as I’ve had 10 new clients since the last update:</p>
<figure class="center">
<img src="/images/3_yr_consulting/worldwide_clients.jpg" alt="Map with 32 Clients I had all over the world" />
<figcaption>Clients I've worked with since starting</figcaption>
</figure>
<figure class="center">
<img src="/images/3_yr_consulting/eu_clients.jpg" alt="Image showing 13 of my clients in Europe" />
<figcaption>Clients I've worked with since starting - EU edition</figcaption>
</figure>
<p>When it comes to clients, there have been some significant changes to my approach since the last update. I have two different rates for clients now. Simply put, for short-term engagement work, I charge more than for long-term clients. I’ve found that this approach has helped a lot with my work-life balance and that the premium for short-term consultations adds some motivation to fit them outside my standard working hours.</p>
<p>In the previous update, I mentioned that I’ve started evaluating EU funding proposals to make sure your tax money is spent well. I’m still doing that, and so far I’ve been involved in five evaluations. Even though these evaluations don’t pay nearly as well as consulting, they are a very good business exercise. On top of this, ensuring that tax money is spent on quality projects that can really elevate companies makes me feel good!</p>
<h2 id="pandemic-and-health">Pandemic and Health</h2>
<p>I’ve been lucky that the COVID-19 pandemic did not affect my projects, and so I’ve maintained a steady stream of work since I started consulting. I also didn’t catch COVID, which was a plus! The way COVID did affect me, however, was that I was days away from burning out. This is because I took on more projects than I could handle and got bored out of my mind with all the social restrictions and locked gyms.</p>
<p>Then, once everything started opening up again, I continued overshooting:</p>
<ul>
<li>I kept the same number of projects (often worked on weekends)</li>
<li>I started going to the gym</li>
<li>I continued running</li>
</ul>
<p>I got to a point where I wanted to make up for all the time away from the gym, and so I was going to the gym three times a week and running on the days in between. After about 2-3 weeks of this, my body seemed to have had enough and I lost all energy for about three weeks; I had issues getting enough sleep and, as a result, was not really capable of doing any exercise.</p>
<p>As I’m writing this post, I think I’ve found a balance: going to the gym three times a week and running at the gym for short periods. The risk of burnout also seems to have gone away with the realisations described in the next section.</p>
<h2 id="working-with-clients">Working with Clients</h2>
<p>I’ve found that the switch to remote working for many offices due to the pandemic actually made me more ingrained within the teams I have worked with. Usually, the teams I work with are within the same building, but with everyone switching to remote working, we were all constrained to the same means of communication.</p>
<p>Another thought I have about remote work: in one of my previous posts, I mentioned that it’s hard to work remotely with hardware. I still hold that opinion, but I’ve found that with experience it’s less and less time-consuming, as long as the team ‘on the ground’ can test my updates and provide feedback.</p>
<p>When I was writing my previous updates, around 90-95% of my projects came from Upwork. Sometime at the beginning of 2021, there was a big reversal, and nowadays more people reach out to me directly than through Upwork. This works quite well most of the time, as I don’t have to pay 20% platform fees for short projects. What I miss, though, is the payment security. So far I’ve been sending out invoices after the consultations, even for new clients, but as some customers have been late with their payments, I’ll probably need to look into being paid upfront for the first few engagements. The risk is manageable for short-term consultations, but investing one month into the work and then stressing out about the customer being late on payment is something I would not recommend.</p>
<figure class="half center">
<img src="/images/360_camera/robosynthesis_robots.jpg" alt="Robosynthesis Robots" />
<img src="/images/3_yr_consulting/su_1.jpg" alt="One of the SimpleUnmanned Autonomous ROS Boats with Ouster LiDAR" />
<img src="/images/3_yr_consulting/su_2.jpg" alt="Another SimpleUnmanned Autonomous ROS Boat with a small LiDAR" />
<figcaption>Some of the robots I contributed to in the past year and a bit</figcaption>
</figure>
<p>When it comes to consulting projects in the past 1.5 years, I focused on the following:</p>
<ul>
<li>A modular mobile robot with <a href="https://www.ross-robotics.co.uk/">Ross Robotics</a>.</li>
<li>An unmanned boat for automatic scanning of bodies of water with <a href="https://simpleunmanned.com/">Simple Unmanned</a> - a nice dive into bathymetry (pun intended).</li>
<li>An industrial robotics project that I won’t share any details on at this time.</li>
<li>Multiple one-hour consultations on SLAM, LiDARs and mobile robots.</li>
<li>The EU evaluation projects I mentioned previously.</li>
</ul>
<h2 id="time-management">Time Management</h2>
<p>In my previous ‘2 Years In’ post, I mentioned the negative aspects of context switching, how I was trying to avoid it and how I would do it, at most, 2-3 times a day. In the past three months or so, I have taken it to yet another level and started booking ‘customer days’. This fully eliminates context switching, and if I’ve agreed with a customer to be available on Mondays, Wednesdays and Thursdays, then they are assured I’ll be reachable. If I have any extra consultations, they come after-hours.</p>
<p>There is one more realisation I had about time this year: I can’t work more than six hours a day on tasks requiring a high level of focus. This will vary slightly depending on the task, but I’ve noticed that accepting this has resulted in the following for me:</p>
<ul>
<li>Feeling way better with my work-life balance.</li>
<li>Taking on fewer projects -> feeling less stressed (consultants who have been through the conjunction of deliverables for 3 projects at the same time will know what I’m talking about).</li>
<li>Producing higher-quality results and thus being more cost-efficient for clients.</li>
</ul>
<h2 id="weekly-robotics">Weekly Robotics</h2>
<figure class="center half">
<img src="/images/3_yr_consulting/WEEKLY ROBOTICS_v1_MEDIUM.png" alt="Weekly Robotics logo" />
<figcaption>Weekly Robotics Logo</figcaption>
</figure>
<p><a href="https://weeklyrobotics.com/">Weekly Robotics</a>, the newsletter that I’ve been maintaining since August 2018, is still my favourite project. Come to think of it, it’s crazy that I haven’t missed a single issue since I started this project. So far this year I’ve clocked 170 hours working on this project, and right now we are sitting at 2,700 e-mail subscribers.</p>
<figure class="center">
<img src="/images/3_yr_consulting/mc_subs.png" alt="Graph of Weekly Robotics subscribers" />
<figcaption>Subscription growth for Weekly Robotics</figcaption>
</figure>
<p>The biggest addition to the newsletter this year was the <a href="https://weeklyrobotics.com/weekly-robotics-148">meetups</a>. In the first season, we had 12 presenters on all kinds of topics related to our industry. We have restarted the series this fall, and I’m really looking forward to continuing to learn! If you would like to join any of our meetups, you can either subscribe to the newsletter or follow us on <a href="https://www.eventbrite.co.uk/o/weekly-robotics-32664029797">Eventbrite</a>.</p>
<p>The only recurring funding I get for the newsletter is the <a href="https://www.patreon.com/WeeklyRobotics">Patreon</a>, where we have 21 patrons. Occasionally, I post referral links to the Humble Books Bundle (just like <a href="https://www.humblebundle.com/?partner=weeklyrobotics">this one</a>) related to technology. If a reader decides to purchase the bundle, they can choose to give a portion of the price to the newsletter. So far, most of the sponsored posts were me bartering the space in the newsletter for discounts/media sponsorship/robotic hardware.</p>
<p>I don’t think the newsletter will ever come close to generating as much profit as consulting does, given that I spend 3-6 hours a week on it, but I think the newsletter is a great investment for the following reasons:</p>
<ul>
<li>I have a good motivation to stay up-to-date on the industry.</li>
<li>I love that I have a chance to discuss thoughts and ideas with industry experts on the WR Slack; having some sense of community around the project feels good.</li>
<li>It helps a lot with discussions with potential clients, as it shows I’m consistent.</li>
</ul>
<h2 id="life-updates">Life Updates</h2>
<p>I’m not sure if anyone cares, but in September 2020, I moved to Prague, rented a workshop and proceeded to take things to the next level. City-wise, I love it! There are so many activities to choose from, nice parks and good beer, but I have to say, I miss the French mountains quite a bit.</p>
<figure class="center">
<img src="/images/3_yr_consulting/office_collage.png" alt="Collage of the office" />
<figcaption>The office</figcaption>
</figure>
<figure class="center">
<img src="/images/3_yr_consulting/office_pano.jpeg" alt="Panorama of the office" />
<figcaption>Panorama of the office</figcaption>
</figure>
<p>I still have a long way to go with the workshop, but having so much space (45 square meters to be exact) for activities is great. I also discovered that I’m much more productive in the office than at home. Having a clear separation between work and home life seriously boosts my productivity. Another plus is that it’s much easier to keep the rooms clean without having so many tools lying around.</p>
<h2 id="future-plans">Future Plans</h2>
<p>I plan to keep on the current track for the foreseeable future. I will definitely keep acting on my ROS2 <a href="https://en.wikipedia.org/wiki/Fear_of_missing_out">FOMO</a>, as I quite enjoy it so far - more news on that in about six months!</p>
<p>Even though I’ve learnt to say no to robotics projects, I still can’t seem to say no to more exotic opportunities. This usually ends up as follows:</p>
<blockquote>
<p>Of course I will be a technical reviewer for your robotics book.</p>
</blockquote>
<blockquote>
<p>You want me to develop a ROS course? Say no more!</p>
</blockquote>
<blockquote>
<p>I’m supposed to be an influencer for your brand on LinkedIn? Let’s do it!</p>
</blockquote>
<blockquote>
<p>Yes, I will join your Board of Directors (this hasn’t happened yet, but I have a strange feeling this will be my answer).</p>
</blockquote>
<p>I think these are all the thoughts that I have to share this time. Thanks for reading and keep on roboting!</p>
<p><a href="https://msadowski.github.io/3-5-years-remote-consulting/">Remote Robotics Consulting - 3.5 Years In</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on October 31, 2021.</p>
Robotics https://msadowski.github.io/3d-mapping-with-ros2021-06-28T00:00:00-00:002021-06-28T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>In my <a href="https://msadowski.github.io/rtk-plug-n-play-with-septentrio/">previous post</a> I’ve shared some of my experience working with Septentrio Mosaic X-5 dev kits. After seeing some good results with that hardware, I decided to use it with some other hardware to realise a prototype that I had had in mind for quite some years. In this post I’m going to describe what I did to get a first prototype working and share some tips for anyone who would like to replicate it.</p>
<!-- more -->
<h2 id="the-idea">The idea</h2>
<p>I’ve been interested in developing a proof of concept system for mapping (either using SLAM or other methods) for at least 7 years now (it might be slightly related to the fact that I was developing something similar for my master’s thesis). Since I’ve had a chance to work with RTK GNSS units (first with <a href="https://msadowski.github.io/ardusimple-rtk-moving-forward/">ArduSimple</a> and then with <a href="https://msadowski.github.io/rtk-plug-n-play-with-septentrio/">Septentrio’s solutions</a>), the logical next step was to start integrating more sensors. Before we get into the nitty-gritty details and the final demo I ran, let’s take a look at the first prototype of the idea:</p>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/3d_mapping/room_scanning.mp4" type="video/mp4" />
</video>
<figcaption> Indoor scanning
</figcaption>
</figure>
<h2 id="the-design">The design</h2>
<p>Here is the list of hardware that I had access to for this project:</p>
<ul>
<li>2x <a href="https://shop.septentrio.com/en/shop/mosaic-h-gnss-heading-module-evaluation-kit-2-gnss-antennae">Septentrio Mosaic X-5 dev kit</a> - precise GNSS modules in RTK setup</li>
<li><a href="https://www.xsens.com/products/mti-600-series">Xsens MTi-630-DK</a> - AHRS (a precise magnetometer was a must-have for the prototype)</li>
<li><a href="https://www.livoxtech.com/mid-40-and-mid-100">Livox Mid-40 LiDAR</a></li>
<li>Two Mikrotik routers</li>
</ul>
<h3 id="sensors">Sensors</h3>
<h4 id="livox-mid-40">Livox Mid-40</h4>
<figure class="center">
<img src="/images/livox/livox_mid40.jpg" alt="Livox Mid-40" />
<figcaption>Livox Mid-40 LiDAR</figcaption>
</figure>
<p>I’ve received Mid-40 from Livox over 2 years ago and described my initial thoughts in <a href="https://msadowski.github.io/livox-mid40-review/">this blog post</a>. Having a neat scanning pattern was an interesting feature for the prototype. With a single plane LiDAR, I would have to move the prototype quite a bit to gather data, with Mid-40 I could instantly tell if the data was good because I would immediately see the features in the scans.</p>
<h4 id="xsens-mti-630-dk">Xsens MTi-630-DK</h4>
<figure class="center">
<img src="/images/3d_mapping/xsens.jpg" alt="Xsens AHRS" />
<figcaption>MTi-630-DK mounted on the prototype frame</figcaption>
</figure>
<p><a href="https://www.xsens.com/">Xsens</a> was very kind to lend me an evaluation unit of the MTi-630-DK for my demonstrator, even though the initial phase of the project took me way longer than anticipated. The MTi-630 is an Attitude and Heading Reference System (AHRS): it runs on-board sensor fusion algorithms that estimate absolute 3D orientation. In this blog post I’m using IMU and AHRS interchangeably. The main feature I was after was as precise a globally referenced heading as I could get. The better the AHRS, the less angular drift I would expect to see in my data.</p>
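<p>To illustrate why the magnetometer matters for that heading: stripped of all filtering and calibration, a tilt-compensated magnetic heading can be computed from raw accelerometer and magnetometer vectors roughly like this (a conceptual sketch, not Xsens’ algorithm; the frame convention and numbers are my assumptions):</p>

```python
import math

# Tilt-compensated magnetic heading from a single accelerometer +
# magnetometer sample. Body frame assumed: x forward, y right, z down.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = math.sqrt(sum(c*c for c in v))
    return tuple(c / n for c in v)

def heading_deg(accel, mag):
    # At rest the accelerometer measures specific force (pointing up),
    # so 'down' in the body frame is the negated, normalized reading.
    down = normalize(tuple(-c for c in accel))
    east = normalize(cross(down, mag))   # down x B points magnetic east
    north = cross(east, down)
    # Heading of the body x-axis relative to magnetic north
    return math.degrees(math.atan2(east[0], north[0]))

# Level, facing magnetic north (field has a downward dip component):
print(round(heading_deg((0.0, 0.0, -9.81), (20.0, 0.0, 40.0)), 1))   # 0.0
# Level, facing east:
print(round(heading_deg((0.0, 0.0, -9.81), (0.0, -20.0, 40.0)), 1))  # 90.0
```

Note how the heading depends entirely on the magnetometer vector: any disturbance of that vector corrupts the globally referenced heading, which is exactly the failure mode described further down.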
<p>In the video I shared in the previous section you can see me testing the first prototype in a room, running just the MTi and the Mid-40. I used double-sided tape to attach the unit to the top of the LiDAR, created a simple URDF file with the correct offset between frames and voila! By slowly rotating the LiDAR in place I was able to scan the room. Let’s take a look at it again:</p>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/3d_mapping/room_scanning.mp4" type="video/mp4" />
</video>
<figcaption> Indoor scanning
</figcaption>
</figure>
<p>Do you see an issue with the 3D map being created? As the LiDAR is turning left, you can notice that the new points appear shifted with regard to the points that were captured initially.</p>
<figure class="center">
<img src="/images/3d_mapping/lidar_offset.png" alt="Point cloud with an offset between old and new points" />
<figcaption>Bad map!</figcaption>
</figure>
<p>In my hastily prepared prototype, I assumed that the Mid-40 origin is in the centre of the LiDAR, while according to the docs the origin is flush with the front face of the unit. Oh well, it’ll be fixed in the next iteration, which will include the next sensor.</p>
<p>And if you are thinking now “hey, Mat, won’t placing the sensor so close to some moving parts, which are undoubtedly magnetic and surely inside the LiDAR, affect the sensor readings?”, then you are right. Even though I couldn’t see it when looking at the data in RVIZ, much later I noticed that turning on the LiDAR affected the magnetometer reading that was fused by the AHRS for orientation estimation.</p>
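<p>One cheap sanity check I could have run earlier: the magnitude of the Earth’s magnetic field is roughly constant at a given location (~25-65 µT depending on where you are), so logging the field magnitude with the LiDAR off and comparing live samples against it flags this kind of disturbance. A sketch, with illustrative numbers and threshold:</p>

```python
import math

# Flag magnetometer samples whose field magnitude deviates noticeably
# from a calm-state reference captured with all other hardware off.

def field_magnitude_ut(mx, my, mz):
    return math.sqrt(mx*mx + my*my + mz*mz)

def looks_disturbed(sample_ut, reference_ut, tolerance=0.15):
    """Flag samples whose magnitude deviates >15% from the reference."""
    return abs(field_magnitude_ut(*sample_ut) - reference_ut) > tolerance * reference_ut

ref = field_magnitude_ut(20.0, 1.0, 43.0)        # captured with the LiDAR off
print(looks_disturbed((20.0, 1.5, 43.5), ref))   # False - normal noise
print(looks_disturbed((35.0, 10.0, 55.0), ref))  # True - something is off
```

This won’t catch a disturbance that happens to rotate the field without changing its magnitude, but it catches the common case of a nearby motor or current loop switching on.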
<h4 id="septentrio-mosaic-x-5">Septentrio Mosaic X-5</h4>
<figure class="center">
<img src="/images/septentrio/mosaic_dev_kit.jpeg" alt="Septentrio Mosaic X-5 Dev Kit" />
<figcaption>Septentrio Mosaic X-5 Dev Kit</figcaption>
</figure>
<p>I used the Septentrio dev kits in exactly the same way I described in the <a href="https://msadowski.github.io/rtk-plug-n-play-with-septentrio/">previous blog post</a>. Just as a quick recap: one unit functioned as a base station, sending corrections to the rover unit mounted on the prototype, with everything exchanging information over WiFi.</p>
<p>Having very precise location information is the core problem to solve for this prototype. Another approach would be to run SLAM, but this would require more processing power and most likely a LiDAR with a more repeatable pattern and a wider field of view.</p>
<h2 id="putting-it-all-together">Putting it all together</h2>
<h3 id="hardware">Hardware</h3>
<p>Here is how I connected everything up when it comes to hardware:</p>
<figure class="center">
<img src="/images/3d_mapping/architecture.jpg" alt="Hardware diagram" />
<figcaption>Simplified hardware architecture of the system</figcaption>
</figure>
<p>I’ve mounted all the components on a frame built from 2020 V-slot profiles, making sure I have enough room for anything I might want to add in the future:</p>
<figure class="center">
<img src="/images/3d_mapping/prototype.jpg" alt="2020 profile based prototype" />
<figcaption>First prototype (before moving IMU to the bottom frame)</figcaption>
</figure>
<p>The LiDAR and the router were both powered off a single 3S LiPo battery, while the GNSS unit and the IMU were powered from my laptop’s USB ports.</p>
<p>There are two issues with this setup:</p>
<ol>
<li>I planned to use my laptop for processing and could not be bothered to set up a smaller processing unit</li>
<li>The unit weighs too much to be carried comfortably for long periods, even if a second person carries the laptop</li>
</ol>
<p>The best thing about this setup was the flexibility for mounting things. I had enough space to keep adding any hardware I might have needed or wanted to try out. After all, my only goal in these experiments was to prove the concept.</p>
<h3 id="software">Software</h3>
<p>For my prototype, I used ROS Melodic, since that’s the version I’m still mostly using and I had it on my machine. Something I love ROS for is the existing packages and libraries. What I think is interesting is that I didn’t need to write any software for this prototype; I was able to put together a bunch of existing packages with the right configuration (perhaps having a good idea of what I was doing helped too).</p>
<figure class="center">
<img src="/images/3d_mapping/software_architecture.jpg" alt="Software Architecture Diagram" />
<figcaption>ROS nodes I've run for a proof of concept</figcaption>
</figure>
<p>To make testing easier I created a xacro file that defined all the dimensions and components, together with their positions. Being able to visualise the setup made testing easier, as I could double-check that the state estimation matched reality:</p>
<figure class="center">
<img src="/images/3d_mapping/urdf.png" alt="URDF Robot Model" />
<figcaption>Sensor placement in urdf/xacro</figcaption>
</figure>
<p>This information on where the sensors are with respect to one another is used throughout the system. Not only does it allow the PointCloud to be represented accurately in RVIZ, it is also used by the EKF nodes for checking the relative positions between elements.</p>
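<p>For illustration, a minimal xacro/URDF sketch of such a sensor layout might look like this (the link names and offsets are made up for the example, not taken from my actual file):</p>

```xml
<?xml version="1.0"?>
<robot name="mapping_rig" xmlns:xacro="http://ros.org/wiki/xacro">
  <xacro:property name="lidar_height" value="0.25"/>

  <link name="base_link"/>
  <link name="livox_frame"/>
  <link name="imu_frame"/>

  <!-- The Mid-40 origin is flush with its front face, not its centre -->
  <joint name="livox_joint" type="fixed">
    <parent link="base_link"/>
    <child link="livox_frame"/>
    <origin xyz="0.04 0 ${lidar_height}" rpy="0 0 0"/>
  </joint>

  <joint name="imu_joint" type="fixed">
    <parent link="base_link"/>
    <child link="imu_frame"/>
    <origin xyz="0 0 0.05" rpy="0 0 0"/>
  </joint>
</robot>
```

With fixed joints like these, robot_state_publisher broadcasts the static transforms, and getting one origin wrong produces exactly the shifted-points artefact shown earlier.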
<p>The localization is a classic ROS EKF stack, described in detail <a href="http://docs.ros.org/en/noetic/api/robot_localization/html/index.html">on these pages</a>. Running this EKF gives us an estimated position of the robot in a fixed coordinate frame (<code class="language-plaintext highlighter-rouge">map</code> or <code class="language-plaintext highlighter-rouge">odom</code>). Setting the Fixed frame to map in RVIZ then allows us to visualise all the LiDAR points, as you can see here:</p>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/3d_mapping/3d_run.mp4" type="video/mp4" />
</video>
<figcaption> An OK mapping run
</figcaption>
</figure>
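<p>To give an idea of the configuration involved, a single-EKF <code class="language-plaintext highlighter-rouge">robot_localization</code> setup fusing the AHRS orientation with GPS odometry could look roughly like this (the topic names and the exact fused fields are my assumptions, not my actual config):</p>

```yaml
# ekf.yaml - sketch for robot_localization's ekf_localization_node
frequency: 30
two_d_mode: false
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: map            # fuse globally referenced data directly

# Position from navsat_transform_node (hypothetical topic name)
odom0: odometry/gps
odom0_config: [true,  true,  true,     # x, y, z
               false, false, false,    # roll, pitch, yaw
               false, false, false,
               false, false, false,
               false, false, false]

# Absolute orientation from the MTi-630 AHRS (hypothetical topic name)
imu0: imu/data
imu0_config: [false, false, false,
              true,  true,  true,      # roll, pitch, yaw
              false, false, false,
              true,  true,  true,      # angular velocities
              true,  true,  true]      # linear accelerations
```

Each `*_config` matrix selects which of the 15 state variables (pose, twist, acceleration) a given sensor contributes; the docs linked above describe the layout in detail.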
<p>This is one of the two mapping runs I performed with this hardware, with a hand-tuned heading offset. This result is exactly what I needed to prove the concept. If you know the <a href="https://en.wikipedia.org/wiki/Pareto_principle">Pareto principle</a>: I firmly believe that in 20% of the time, I’ve shown 80% of the system functionality. The remaining 80% of the time is a solid grind, with lots of tuning and field trips, to create something viable.</p>
<h2 id="lessons-learned">Lessons Learned</h2>
<p>I’ve learned a tonne in this project, so naturally, here comes a brain dump:</p>
<ul>
<li>V-Slot 2020 profiles are amazing. If you work with them, I highly recommend 3D printing or sourcing some caps that you can place on the profile ends; they are quite sharp, and without caps you don’t want them anywhere near surfaces you care about</li>
<li>Start with a small dedicated computer that will run everything, plus a small screen; running everything on a laptop while carrying a heavy prototype is not a great idea</li>
<li>In my experience so far, every trial takes between 3-5 hours, mostly because I’m in the city and quiet places with a good view of the sky are hard to come by. If you are using ROS, record bagfiles and use them for tuning the system (especially localization and IMU integration)</li>
<li>Use checklists - on one occasion I had all the cables I needed by pure luck. I highly recommend using a checklist for field packing and performing experiments. Ensuring your GNSS unit is connected to the router when testing will surely save you some debugging time as well!</li>
</ul>
<h3 id="further-thoughts">Further thoughts</h3>
<p>For a project like this, timing is everything. Both the LiDAR and the IMU that I used support a sync input, while Septentrio’s Mosaic can easily be configured to provide a sync output. Instead of using an EKF as I did, perhaps the data could be used directly with very little sensor fusion. This would probably be the approach I would have taken if I were developing a handheld module.</p>
<p>Instead, I’m way more likely to put this kind of hardware on a mobile robot of some kind, making sensor fusion a much more likely requirement as the precise position and orientation would be used for other aspects than mapping.</p>
<h2 id="next-steps">Next steps?</h2>
<p>Right now I don’t have any plans to pursue this project further as it is. I’m glad that I was able to prove the concept but I don’t have a use case for such a device. I’ll use these learnings in future robot integration projects. If someone wants to build off the work I’ve described here for an open-source project then I’ll be more than happy to help out, but for now, it’s time to focus on other ideas that I have in the pipeline!</p>
<p><a href="https://msadowski.github.io/3d-mapping-with-ros/">Making a 3D mapping prototype with ROS</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on June 28, 2021.</p>
Robotics ROS https://msadowski.github.io/rtk-plug-n-play-with-septentrio2021-03-31T00:00:00-00:002021-03-31T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>Quite a while ago I received two units of the <a href="https://shop.septentrio.com/en/shop/mosaictm-development-kit">mosaic development kit</a> from Septentrio. This blog post sums up my experience with them and provides some insights into integrating them with ROS.</p>
<!-- more -->
<h2 id="first-impressions">First impressions</h2>
<p>The first thing you notice when unpacking the module is the size of the carrier board: at 10x16 cm it’s quite large; however, the core module is only 7x5 cm.</p>
<p>I don’t think I’ve ever used a positioning system that was this simple to set up. <a href="https://youtu.be/hrL5J6Q5gX8">This video</a>, less than 2 minutes long, explains the basics of how to connect to the module. The gist of it: you connect the receiver antenna, attach a USB cable, navigate in your web browser to the local web server hosted by the module (192.168.3.1), and poof, the interface is there:</p>
<figure class="center">
<img src="/images/septentrio/iface.png" alt="Septentrio web interface" />
<figcaption>Septentrio Web Interface</figcaption>
</figure>
<p>Need to set up the module to output NMEA sentences on a USB port? Simple! Go to NMEA/SBF, select the connection type (USB), the desired serial port, the message types (e.g. GGA, RMC) and the interval you are interested in, and you are good to go. The dev USB port on the Septentrio (the micro-USB port on the carrier board) creates two serial ports that you can use on your target machine. Being able to configure UDP or TCP streams is also an interesting option.</p>
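<p>To give a feel for what a consumer of that NMEA stream does, here is a minimal sketch of validating and parsing a GGA sentence (using the classic example sentence from NMEA documentation, not output captured from the mosaic):</p>

```python
# Minimal NMEA GGA handling: verify the XOR checksum, then extract
# latitude/longitude in decimal degrees from the ddmm.mmm fields.

def nmea_checksum_ok(sentence):
    body, _, checksum = sentence.lstrip("$").partition("*")
    value = 0
    for ch in body:
        value ^= ord(ch)          # checksum is XOR of all chars between $ and *
    return value == int(checksum, 16)

def parse_gga(sentence):
    fields = sentence.split(",")
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    if fields[5] == "W":
        lon = -lon
    return lat, lon

gga = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
print(nmea_checksum_ok(gga))  # True
print(parse_gga(gga))         # approximately (48.1173, 11.5167)
```

Drivers like nmea_navsat_driver do essentially this (plus fix quality, covariance estimation and ROS message plumbing) for every sentence on the serial port.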
<p>The base station-rover setup is very easy as well - the <a href="https://youtu.be/UVUVXpA8rB4">instructional video</a> is less than 3 minutes long, and once you know what you are doing this setup likely takes way less than a minute.</p>
<p>What is also worth noting is the number of constellations that are supported. In my test the base station was receiving positions from 4 systems: GPS, GLONASS, Galileo and BeiDou:</p>
<figure class="center">
<img src="/images/septentrio/fix_base.png" alt="Available satellites" />
<figcaption>Ground station status page</figcaption>
</figure>
<h2 id="testing">Testing</h2>
<h3 id="hardware-setup">Hardware setup</h3>
<figure class="center">
<img src="/images/septentrio/3d_printed_mount.jpeg" alt="Septentrio in 3D printed mount" />
<figcaption>A mount I 3D printed to attach mosaic dev kit to tripod and hold antenna</figcaption>
</figure>
<p>To run some integration tests with ROS, I decided to make two tripods, each carrying a Septentrio mosaic-X5 kit, an antenna, a battery and a wireless router. One of the tripods is used as a base station, while the second one is used as a rover (I carry this one in hand when testing). Additionally, the rover has an IMU with a magnetometer attached next to the antenna.</p>
<figure class="center">
<img src="/images/septentrio/base_tripod.jpeg" alt="Tripod with base module and router" />
<figcaption>Base setup</figcaption>
</figure>
<p>As I mentioned in the previous section, I set up an NMEA USB stream for my tests. To do this, I created a new udev rule for the modules (the full workflow being similar to the one described in my <a href="https://msadowski.github.io/linux-static-port/">static USB tutorial</a>):</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>KERNEL=="ttyACM*", ATTRS{idVendor}=="152a", MODE="0666"
</code></pre></div></div>
<p>I’ve set the two MikroTik routers to PTP Bridge mode, making sure the base station is outputting corrections on a TCP port and that the rover module is subscribing to them.</p>
<h3 id="ros-setup">ROS setup</h3>
<p>The ROS setup is quite classic for a mobile robot, using navsat_transform_node, robot_localization, an IMU driver and nmea_navsat_driver. Let’s go over the most interesting bits of my test setup.</p>
<h4 id="navsat-driver">Navsat driver</h4>
<p>I decided to use the <a href="https://github.com/ros-drivers/nmea_navsat_driver">nmea_navsat_driver</a> for reading NMEA data from the Septentrio through a serial port. The reason I opted for it is simplicity - I could set it all up and be testing in no time. Or so I thought, until I discovered that nmea_navsat_driver prepends the frame_id you provide with <code class="language-plaintext highlighter-rouge">/</code>. This in turn breaks the latest version of robot_localization. It seems to have been fixed in Noetic but is a <a href="https://github.com/ros-drivers/nmea_navsat_driver/pull/33#issuecomment-516641933">wontfix for Melodic</a>.</p>
<h4 id="localization">Localization</h4>
<p>Since for quite a while I’ve wanted to create a handheld device for some data capture (hopefully more info coming soon), I’ve decided on the following assumptions:</p>
<ul>
<li>Single EKF - for any mobile robot setup you would most likely use a <a href="http://docs.ros.org/en/noetic/api/robot_localization/html/integrating_gps.html">dual EKF setup</a> but since I don’t have any sensor providing continuous measurements in planar axes a single EKF instance should be sufficient</li>
<li>IMU fusion for orientation and heading - could be useful in the future when I start pointing my setup at things</li>
<li>WGS84 -> fixed frame transform - I’ve set up robot_localization to publish this transform so that mapviz would display my ‘robot’ in the global frame</li>
</ul>
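A rough launch-file sketch of that localization setup (hypothetical only: the topic names and the handful of parameters shown are placeholders, not my actual launch file, and the `imu0_config`/`odom0_config` matrices and frame parameters are omitted):

```xml
<!-- Sketch: single EKF in the map frame fusing IMU orientation with
     the GNSS odometry produced by navsat_transform_node. -->
<node pkg="robot_localization" type="ekf_localization_node" name="ekf_map">
  <param name="world_frame" value="map"/>
  <param name="imu0" value="/imu/data"/>
  <param name="odom0" value="/odometry/gps"/>
</node>
<node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform">
  <!-- publish the utm->map transform so mapviz can show the 'robot' globally -->
  <param name="broadcast_utm_transform" value="true"/>
  <remap from="gps/fix" to="/fix"/>
  <remap from="imu/data" to="/imu/data"/>
</node>
```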
<h3 id="test-results">Test results</h3>
<p>For this blog post I’ve performed two experiments:</p>
<ol>
<li>Static test - both rover and the base are placed in a static location for 15 minutes, then for around 10 minutes we run the localization stack and observe changes to the local position</li>
<li>Dynamic test - trying to trace a pattern drawn on the ground</li>
</ol>
<p>If you are a robotics engineer and not getting enough sun exposure then I highly recommend testing GNSS modules, this setup allows for some proper vitamin D boost:</p>
<figure class="center">
<img src="/images/septentrio/test_setup.jpeg" alt="Test setup for a sunny day" />
<figcaption>I'm really glad I bought camping equipment at one point of my life</figcaption>
</figure>
<h4 id="static-test">Static test</h4>
<p>At first, I was thinking of setting up a 10x10cm grid in mapviz and testing whether the base frame in the local coordinate system ever moves outside its grid cell during the test. Boy, was I pessimistic…</p>
<p>The test I actually performed was launching robot_localization, which places the local frame at the initial position of the rover receiver, and letting it run for around 10 minutes without touching any tripods. Here is the X and Y position (thanks, <a href="https://github.com/facontidavide/PlotJuggler">PlotJuggler!</a>):</p>
<figure class="center">
<img src="/images/septentrio/test_1_xy.png" alt="X and Y position on a single graph" />
<figcaption>Graph with X and Y position change over time</figcaption>
</figure>
<p>As you can see in the graph, the position didn’t move significantly throughout the test. The lowest measurement on the above graph is -0.010895 m (that’s < 1.1 cm). The ground I set up on wasn’t flat and it was quite windy. I have no idea whether these factors could cause the module to shift slightly, but the result is amazing as it is; working on wheeled mobile robots, I wouldn’t need higher precision anyway.</p>
<figure class="center">
<img src="/images/septentrio/test_1_cov.png" alt="Covariance" />
<figcaption>Covariance recorded during the static test</figcaption>
</figure>
<p>The covariance output from nmea_navsat_driver looks a bit higher than I would expect from a unit with an RTK fix. I will need to check how it’s calculated as the high values of covariance could be an issue when fusing in wheel odometry in later stages of the integration.</p>
<p>What is interesting in the static test is the precision of the height data, which is not far from what I’ve seen for the XY axes:</p>
<figure class="center">
<img src="/images/septentrio/test_1_z.png" alt="Z axis plot for static test" />
<figcaption>Relative height captured throughout the test</figcaption>
</figure>
<h4 id="dynamic-tests">Dynamic tests</h4>
<p>The dynamic tests I performed suffer from one issue: I don’t have any reference information to compare the measurements against. To get an absolute idea of the measurement quality, proper geodetic methods should be used (for example a <a href="https://en.wikipedia.org/wiki/Total_station">total station</a>).</p>
<p>For this test, I traced a roughly rectangular path and walked around it carrying the rover unit. The measurements won’t be very consistent as I was holding the whole rover setup in hand while walking the marked path. Here is the GNSS odometry position:</p>
<figure class="center">
<img src="/images/septentrio/test_2_xy.png" alt="Path graph" />
<figcaption>GNSS pose in local frame</figcaption>
</figure>
<p>The data looks very good. I believe that once the covariance is addressed, this setup should work pretty much out of the box on any mobile robot as the global EKF providing the map->odom transform.</p>
<h2 id="closing-thoughts">Closing thoughts</h2>
<p>When I mentioned Plug&Play in the title of this blog post I wasn’t kidding. I like how easy Septentrio made it to work with these modules. Everything I needed to set up was explained in the videos, with these tutorials rarely exceeding 3 minutes. The quality of support and documentation that you get from Septentrio is also top-notch.</p>
<p>At the same time, there are some features that I’m looking forward to testing in the future:</p>
<ul>
<li>Using SECORX positioning service and/or NTRIP service integration for corrections</li>
<li>Enabling PPS and NTP server on the mosaic’s side</li>
<li>Switching from NMEA output to SBF (Septentrio Binary Format) and testing <a href="https://github.com/septentrio-gnss/septentrio_gnss_driver">ROSaic driver</a> or <a href="https://github.com/Team-Abhiyaan/mosaic_gnss_driver">mosaic_gnss_driver</a></li>
<li>Upgrading the firmware on the Septentrio and running some more tests at an increased update rate (so far I’ve been outputting position at 10 Hz)</li>
<li>Eventually testing <a href="https://www.septentrio.com/en/products/gnss-receivers/rover-base-receivers/receivers-modules/mosaic-h">mosaic-H</a> with dual antenna setup for receiving heading information</li>
</ul>
<p>Now, what would be a really, really interesting test? Taking this setup and adding a quality IMU and a LiDAR unit to it, creating a very precise 3D mapping module.</p>
<p><a href="https://msadowski.github.io/rtk-plug-n-play-with-septentrio/">Plug&Play RTK with Septentrio mosaic-X5 Dev Kit</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on March 31, 2021.</p>
Robotics ROS
<p>It’s that time of year again when I’m thinking back on the journey I set up for myself by deciding to go into Robotics Consulting. This article is a follow up to the blog post I wrote <a href="https://msadowski.github.io/one-year-of-robotics-consulting/">last year</a>. This year, I’ve decided to try and cover the most important aspects of what I’m doing, and how it has been working out for me.</p>
<!-- more -->
<h2 id="the-work-i-had">The work I had</h2>
<p>I’m very grateful for the kind of work that I do - it feels amazing doing what I love and getting paid for it. I would never change my work for anything else. Every single one of my clients has an interesting problem to solve, and I’ve yet to have a negative experience working with someone. Looking back on all my clients, I’ve worked with many interesting people all around the world, mostly helping them to develop software for mobile robots and drones.</p>
<figure class="center">
<img src="/images/2_yr_consulting/clients.png" alt="Map with 22 Clients I had all over the world" />
<figcaption>A world map of the 22 clients I've worked with over the past 2 years</figcaption>
</figure>
<figure class="center">
<img src="/images/2_yr_consulting/clients_europe.png" alt="Image showing 8 of my clients in Europe" />
<figcaption>My European clients</figcaption>
</figure>
<p>The duration of the projects I take on ranges from half a day to 2.5 years. Everything depends on the client and what they require. Most of my projects kick off with a consultation - helping the client solve a particular problem by sharing my expertise and experience. After this initial phase, some clients decide to hire me as a contractor, helping their technical team solve the challenges they face, while others are happy to carry on by themselves with the advice received.</p>
<p>As a tangent to the consultations, I also help evaluate robotics funding proposals, as a reviewer in EU technological funding projects <a href="http://www.esmera-project.eu/welcome/">ESMERA</a> and <a href="https://trinityrobotics.eu/">TRINITY</a>, making sure your tax euros are spent on truly cutting-edge projects.</p>
<h3 id="putting-myself-in-my-clients-shoes">Putting myself in my clients’ shoes</h3>
<p>Every project I take on, I treat as my own. This results in me trying to solve issues before they happen, and sometimes even pushing back on client’s decisions - “How about we don’t remove this safety feature?”. Another side effect of treating the project as my own is that I put the client first, before my own business interest, meaning I might earn less for the project overall.</p>
<p>As an example, I was hired to test the idea a client had for a robotic project. The idea turned out to be borderline feasible, with currently available technology, but potentially requiring lots of R&D work. If I sold the project as “no problem, it’s totally doable”, I would definitely get more work supporting the R&D work required. By putting the project and client’s interests first, however, we finished the project after the feasibility study, having realised the extent of R&D work needed.</p>
<p>I’m content having ended such a project before the R&D phase, and the client is most likely even more so: rather than spending lots of money without understanding the R&D effort required, they can now decide whether to pursue the idea further. If the client does decide to take the risk, they know that I’ll be there, ready to treat their project as my own and provide honest advice, even if it means I get less work from it.</p>
<h3 id="the-hardware">The hardware</h3>
<p>Working with hardware is one of the most enjoyable parts of my job. This year, I’ve been focusing a lot on RTK solutions (I have one blog post in the works on a neat RTK setup). Also this year, I finally got my hands on a multi-plane LiDAR, something I’ve been very keen on ever since Velodyne revealed their first multi-plane units.</p>
<figure class="center">
<img src="/images/2_yr_consulting/hardware.png" alt="Hardware units I've worked with" />
<figcaption> Some of the hardware I've worked with since I started working as a Remote Robotics Consultant</figcaption>
</figure>
<h2 id="the-work-i-did-not-win">The Work I Did Not Win</h2>
<h3 id="the-dream-project">The dream project</h3>
<p>At one point, I was approached by a client who wanted to create a certain type of robot that I’ve always dreamt of working with. The task would require me to liaise with manufacturers and developers to get the robot ready, and enable support for autonomous navigation in various terrain types. In short, a hugely challenging project that would immensely help me grow, whilst being super fun at the same time. When discussing the project in more detail, I noticed that the budget for this client was not an issue - any platform for any price was OK, as long as it met the requirements. However, when asking follow up questions, it became clear that the project target was to equip robots with weapons.</p>
<figure class="center">
<video controls="controls" class="center" style="width:100%">
<source src="/images/2_yr_consulting/bd.mp4" type="video/mp4" />
</video>
<figcaption>The kind of project I’m not keen to work on. Source: <a href="https://www.youtube.com/watch?v=y3RIHnK0_NE">Bosstown Dynamics</a> (Corridor Digital)
</figcaption>
</figure>
<p>When I was applying to University, I promised myself I would not work on weapons, even if such projects have a virtually unlimited budget. I figured I would rather fold my consultancy than compromise on my values.</p>
<h3 id="the-bargain">The bargain</h3>
<p>In the two years working as a Robotics Consultant, I’ve noticed a pattern: the more a potential client bargains before starting the job, the more bargaining and complaining will follow, resulting in an unpleasant experience for everyone involved. In the same bucket as bargaining clients are the UpWork projects that want you to create a SLAM library from scratch for a fixed price of $50. These days, I tend to reject these types of clients and contracts straight away, saving everyone’s time.</p>
<h3 id="watch-out-for-the-legal-stuff">Watch out for the legal stuff</h3>
<p>If I’m receiving 25 pages of legal documents to review before engaging with a customer, it’s a potential red flag. I don’t mind reading legal documents, but I would rather not involve a lawyer before even starting a project. For me, a huge red flag is when a potential client adds a very vaguely described 3-year non-compete clause, covering a whole industry. I find such terms too restrictive and would never take a project on like this, unless the project also paid for a 3 year holiday after wrapping everything up!</p>
<h3 id="the-lack-of-expertise">The lack of expertise</h3>
<p>As much as I would love to know everything about Robotics, it’s not likely to happen anytime soon. I will never take on projects that fall outside of my expertise (for example, algorithms for soft robotics), unless the client is persistent and understands an R&D process. My go-to advice in such cases is that it’s not feasible to pay my high rates for me to catch up on the topic. In a hypothetical situation, if the project falls outside of my expertise, but it’s in an area I’m planning to pursue, I’d offer a very low rate for the project, highlighting that I will need to do some catching up.</p>
<h2 id="remote-robotics">Remote Robotics</h2>
<p>People often ask me how I work on robotics projects remotely (I’ve been doing this since day 1 of my consultancy). When I’m doing some high level work, it’s usually not an issue. It only becomes slightly more problematic with hardware-oriented projects. The options I’ve tested so far are:</p>
<ul>
<li>For ROS systems, working with bag files, and in the case of drones, working with log files</li>
<li>Clients shipping the hardware to me</li>
<li>Travelling to the client’s location</li>
<li>Remote control</li>
</ul>
<figure class="center">
<img src="/images/2_yr_consulting/office.jpg" alt="My office in the times of pandemic" />
<figcaption>Me and my partner's office in the time of the COVID-19 pandemic</figcaption>
</figure>
<p>ROS is absolutely the best thing that happened to people like me - I can easily review things online and even develop software with just <a href="http://wiki.ros.org/rosbag">bagfiles</a>. However, it’s not always feasible to work with bags, especially if you need to do some work related to sensors and actuators. In these cases, clients will often ship their hardware to me, so that I can integrate it locally. A very convenient way to work with clients like this is a temporary import - it spares you the risk of paying taxes and duties on expensive robotics parts, and as far as I know, with a temporary import you can keep the items for up to a year.</p>
<p>Travelling to the client’s location usually results in a couple of days of hackathon-style issue solving. I love these trips, as they allow me to put faces to Slack usernames, and I like the dynamics of these kinds of projects. On the flip side, the last time I did it, I ended up doing about 60 hours of work in 5 days - not very sustainable, especially as you need to rest for a couple of days afterwards, but I would do it again!</p>
<p>Remote control of the client’s desktop is something that I try to avoid as much as possible when working with hardware. I find that not having physical access to the components is often very limiting (“can you please unplug the cable for me?”) and I estimate that in the worst cases, you are 20-40% less productive by working on robotics in this way.</p>
<h2 id="other-thoughts-on-consultancy-and-self-employment">Other Thoughts on Consultancy and Self-Employment</h2>
<h3 id="freedom">Freedom</h3>
<p>Freedom is one of the most important aspects of what I do. If I don’t work with hardware for a project, then I can usually work from whatever place I want. If I have a day when I don’t feel like working, or can’t work, then taking a day off is not an issue. To an extent, I can choose which projects to work on (or at least which ones to say “no” to).</p>
<figure class="center">
<img src="/images/2_yr_consulting/outdoors.jpg" alt="A mountain view" />
<figcaption>Having a possibility to go out into the mountains in the middle of the week is something I appreciate a lot</figcaption>
</figure>
<p>This freedom, however, comes with a price attached to it - if I’m not working, then I’m not making money. I don’t get bank holidays and if I get sick and can’t work, then I’m not earning either. These are the main reasons why I think working for oneself is not for everybody, especially when having no projects lined up, as it can become quite stressful.</p>
<p>Being self-employed means that you are at the centre of your business. For me, this means that my physical and mental health are my number one priorities. Examples of how I exercise these are:</p>
<ul>
<li>Three times a week, around 4pm, I do sports (normally, I go to the gym, but during the pandemic, I’ve been going for a run)</li>
<li>Practising meditation as the first thing I do in the morning</li>
<li>In case of health issues, I look to fix them without thinking twice about the money (being located in Europe is a huge advantage here, as I know no doctor’s visit will ruin me financially)</li>
</ul>
<h3 id="upwork">Upwork</h3>
<p>Over the past year, most of my work has come from Upwork (you can see my profile <a href="https://www.upwork.com/freelancers/~0196b3ccb97605e632">here</a>). I won’t repeat the things I said <a href="https://msadowski.github.io/one-year-of-robotics-consulting/#upwork">last year</a> - I’ve grown to appreciate the service. I realised that the 20% fee for the first $500 doesn’t hurt as much when you target long-term or high-bid projects. The value I get in terms of not having to chase customers for payment, easily logging the time, and the discoverability, makes it worth it in my case. Some people criticise Upwork for the screenshots it takes of your screen while you’re working on hourly projects, but for me, this is a non-issue - when I’m working on a project, I never have anything private open and I’m just doing the work I charge customers for. If a project required me to have a webcam on though (it’s an optional requirement, I believe), I wouldn’t accept it.</p>
<figure class="center">
<img src="/images/2_yr_consulting/top_rated_plus.png" alt="UpWork Top Rated Plus badge" />
<figcaption>My hard-earned Upwork Top Rated Plus badge</figcaption>
</figure>
<p>If you are looking into starting on Upwork, then I’d recommend figuring out the best way to get paid, without losing out on payment and currency conversion fees. The most cost-efficient way I came across was setting up a <a href="https://transferwise.com/invite/u/mateuszs27">TransferWise account</a> (<– with this link, you’ll get your first transfer of up to 500 GBP for free). In TransferWise, I’ve set up a US account allowing me to handle ACH payments (the method Upwork uses to pay me). That way, I don’t pay anything to get the money from Upwork, but instead, I only pay for the TransferWise exchange rate, which is the best I found when I was looking into this.</p>
<p>To save time on browsing Upwork projects, I set up an RSS feed with the keywords that interest me. That way, I can see all of the projects that were posted on the platform, without having to go through the same ones by visiting the website.</p>
<h3 id="mastermind-group">Mastermind group</h3>
<p>For over a year now, I’ve been having weekly <a href="https://en.wikipedia.org/wiki/Mastermind_group">Mastermind group</a> meetings with my friend, Michał (if you need to tell a visual story, then I highly recommend <a href="https://www.michalkalina.com/">getting in touch with him</a>). In these weekly calls, we discuss the business and problems we are facing. Having someone to bounce ideas around is very helpful, especially if you can get honest feedback from someone running into similar issues as you do.</p>
<figure class="center">
<img src="/images/2_yr_consulting/mastermind.png" alt="Me and Michał during one of our Mastermind meetings" />
<figcaption>Me and Michał during one of our Mastermind meetings. Do you also think outdoor business meetings should be the norm?</figcaption>
</figure>
<h3 id="attention-switching-penalty">Attention switching penalty</h3>
<p>This is advice to all of you working on multiple projects at the same time. In my experience, every time I switch focus from one project to another, it takes about 10-20 minutes to get into the zone, and to not think about the other one. Because of this, I try to switch between projects as little as possible - ideally less than 2-3 times a day. Most of the time before a switch, I try to take a short break.</p>
<h2 id="closing-thoughts">Closing thoughts</h2>
<p>Having worked for over 2 years as a Remote Robotics Consultant, I can’t imagine doing anything else. I take lots of satisfaction and pride in my work, and I hope clients notice my focus on uncompromised quality. There is a small issue though - if your work is your hobby, when do you rest? I realised I might have been pushing myself a bit too hard during lockdown (having a clear split between the office and home was probably one of the reasons I was working from an office in the first place). Looking to the future, I’m hoping to rent a proper workshop and work from there.</p>
<p>Do you have any questions about my work? Feel free to leave a comment and I’ll answer them! And if you would like to work with me, then feel free to send me an <a href="mailto:[email protected]">e-mail</a>.</p>
<p><a href="https://msadowski.github.io/2-years-remote-consulting/">Thoughts on 2 Years as a Remote Robotics Consultant</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on August 26, 2020.</p>
Robotics
<p>Recently I’ve started integrating an Ouster OS1-16 on a client’s platform, and he was nice enough to allow me to run some tests with this LiDAR sensor. This post describes my first impressions of this unit and provides some pointers that can be useful for anyone looking into using Ouster sensors with the Robot Operating System.</p>
<!-- more -->
<h2 id="hardware">Hardware</h2>
<table>
<thead>
<tr>
<th style="text-align: left">Parameter</th>
<th style="text-align: left">Value</th>
</tr>
</thead>
<tbody>
<tr>
<td style="text-align: left">Range</td>
<td style="text-align: left">0.25-120m (at 80% reflectivity)</td>
</tr>
<tr>
<td style="text-align: left">Range resolution</td>
<td style="text-align: left">0.3 cm</td>
</tr>
<tr>
<td style="text-align: left">Horizontal FoV</td>
<td style="text-align: left">360°</td>
</tr>
<tr>
<td style="text-align: left">Vertical FoV</td>
<td style="text-align: left">33.2°</td>
</tr>
<tr>
<td style="text-align: left">Rotation frequency</td>
<td style="text-align: left">10 or 20Hz</td>
</tr>
<tr>
<td style="text-align: left">Laser wavelength</td>
<td style="text-align: left">865nm</td>
</tr>
<tr>
<td style="text-align: left">Operating power</td>
<td style="text-align: left">14-20W</td>
</tr>
<tr>
<td style="text-align: left">Operating voltage</td>
<td style="text-align: left">24V</td>
</tr>
<tr>
<td style="text-align: left">Interface</td>
<td style="text-align: left">Gigabit Ethernet (UDP)</td>
</tr>
<tr>
<td style="text-align: left">Price</td>
<td style="text-align: left">~3.5k USD</td>
</tr>
</tbody>
</table>
<h2 id="setup">Setup</h2>
<p>Here are some useful links that should get you started if you wanted to use OS1 sensors:</p>
<ul>
<li><a href="https://ouster.com/resources/">Ouster Resources</a> - the Software User Guide is a must-read before starting with this LiDAR</li>
<li><a href="https://github.com/ouster-lidar/ouster_example">Ouster ROS driver</a> - The driver developed by the Ouster team</li>
<li><a href="https://github.com/SteveMacenski/ros2_ouster_drivers">ROS 2 Ouster Drivers</a> - Drivers developed by Steve Macenski</li>
<li><a href="https://ouster.com/blog/building-maps-using-google-cartographer-and-the-os1-lidar-sensor/">Building Maps Using Google Cartographer and the OS1 Lidar Sensor</a> - Very useful tutorial</li>
</ul>
<figure class="center">
<img src="/images/ouster/ouster_image.png" alt="Raw Ouster measurements" />
<figcaption>Ouster OS1-16 measurements in Ouster Studio</figcaption>
</figure>
<h3 id="networking">Networking</h3>
<p>All the tutorials mention setting up DHCP for Ouster sensors; however, if you can find the sensor’s IP address then, in my experience so far, you can use it directly. For these initial tests I used the IP only and didn’t have any issues with the Ouster driver, Ouster Studio, or connecting to the sensor using netcat.</p>
<p>I found netcat to be very useful in my first tests; the commands I was running were:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>nc 10.42.0.245 7501
get_sensor_info
get_time_info
set_config_param timestamp_mode TIME_FROM_PTP_1588
reinitialize
write_config_txt
</code></pre></div></div>
<p>Among these commands <code class="language-plaintext highlighter-rouge">get_time_info</code> was very useful to roughly check the sensor timestamp against my computer time and make sure the clocks are synchronized.</p>
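The same session can be scripted; here is a hedged Python sketch of the netcat workflow above (the host address is from my test network, port 7501 is the command port used above, and <code class="language-plaintext highlighter-rouge">clock_offset</code> is just a helper of mine for eyeballing the drift, not part of any Ouster API):

```python
import socket
import time

def query_sensor(cmd, host="10.42.0.245", port=7501, timeout=2.0):
    """Send one command to the sensor's TCP configuration port (the same
    port used with netcat above) and return the raw reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall((cmd + "\n").encode())
        return sock.recv(65536).decode()

def clock_offset(sensor_time_s, local_time_s=None):
    """Rough sensor-vs-host clock offset in seconds (positive: sensor ahead)."""
    if local_time_s is None:
        local_time_s = time.time()
    return sensor_time_s - local_time_s
```

On the sensor network you could then call <code class="language-plaintext highlighter-rouge">query_sensor("get_time_info")</code> and feed the parsed sensor timestamp into <code class="language-plaintext highlighter-rouge">clock_offset()</code>.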
<h3 id="time-syncronization">Time synchronization</h3>
<p>Handling time properly is very important, especially if you want to display your data live in software like RVIZ. Here are the timing options available on the OS1 (via the Software User Guide):</p>
<ul>
<li><code class="language-plaintext highlighter-rouge">TIME_FROM_INTERNAL_OSC</code> - Use the internal clock. Measurements are time stamped with ns since power-on. Free running counter based on the OS1’s internal oscillator. Counts seconds and nanoseconds since OS1 turn on, reported at ns resolution (both a second and nanosecond register in every UDP packet), but min increment is on the order of 10ns. Accuracy is +/- 90ppm.</li>
<li><code class="language-plaintext highlighter-rouge">TIME_FROM_SYNC_PULSE_IN</code> - A free running counter synced to the SYNC_PULSE_IN input counts seconds (#of pulses) and nanoseconds since OS1 turn on. If multipurpose_io_mode is set to INPUT_NMEA_UART then the seconds register jumps to time extracted from a NMEA$GPRMC message read on the multipurpose_io port. Reported at ns resolution (both a second and nanosecond register in every UDP packet), but min increment is on the order of 10 ns. Accuracy is +/- 1 s from a perfect SYNC_PULSE_IN source.</li>
<li><code class="language-plaintext highlighter-rouge">TIME_FROM_PTP_1588</code> - Synchronize with an external PTP master. A monotonically increasing counter that will begin counting seconds and nanoseconds since startup. As soon as a 1588 sync event happens, the time will be updated to seconds and nanoseconds since 1970. The counter must always count forward in time. If another 1588 sync event happens the counter will either jump forward to match the new time or slow itself down. It is reported at ns resolution (there is both a second and nanosecond register in every UDP packet), but the minimum increment varies. Accuracy is +/-<50us from the 1588 master.</li>
</ul>
<p>Initially, I was trying to use <code class="language-plaintext highlighter-rouge">TIME_FROM_SYNC_PULSE_IN</code> with my <a href="https://msadowski.github.io/ardusimple-ros-integration/">ArduSimple boards</a>. It worked OK but if you read the description of the mode above you’ll notice it says “If multipurpose_io_mode is set to INPUT_NMEA_UART then the <strong>seconds register jumps</strong>” - this explains why accuracy is +/- 1s from the pulse in source.</p>
<p>Another thing to note about using <code class="language-plaintext highlighter-rouge">TIME_FROM_SYNC_PULSE_IN</code> is that with 1.13 software <a href="https://github.com/ouster-lidar/ouster_example/issues/154">there is a bug</a> which causes the reported time to be off by 24 hours from the time provided.</p>
<p>If you need accurate timing then <code class="language-plaintext highlighter-rouge">TIME_FROM_PTP_1588</code> is the way to go. The Software User Guide has a very good workflow on how to set it up, even for a single machine. It’s best if the machine supports Ethernet hardware timestamping, but if it doesn’t, software timestamping seems to work well too.</p>
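I won’t reproduce the guide’s full workflow here, but on Linux the master side boils down to running linuxptp on the sensor-facing interface; a rough sketch (the interface name is an assumption of mine - check yours with <code class="language-plaintext highlighter-rouge">ip link</code>):

```shell
sudo apt install linuxptp
# Run a PTP instance on the sensor-facing interface. -m logs to stdout;
# -S forces software timestamping for NICs without hardware support.
sudo ptp4l -i eth0 -m -S
```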
<h2 id="cartographer-test">Cartographer test</h2>
<p>After making sure my launch files worked with the sensor, I wanted to test a SLAM implementation. I ended up trying the sensor with <a href="https://google-cartographer.readthedocs.io/en/latest/">Cartographer</a>:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/NuVm9Ge_NgU" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>In my RVIZ settings, I’ve set the point cloud decay time to 4 seconds. As you can see the localization is not perfect but it’s not too bad, given I’ve used the default configuration from Ouster’s blog post on Cartographer integration.</p>
<h2 id="point-cloud-assembly">Point cloud assembly</h2>
<p>The second thing I wanted to explore for this blog post was how to build a usable point cloud from the localized sensor data. This led me to <code class="language-plaintext highlighter-rouge">point_cloud2_assembler</code> from the <a href="http://wiki.ros.org/laser_assembler">laser_assembler</a> package. The ROS Wiki describes this package as follows:</p>
<blockquote>
<p>The laser_scan_assembler subscribes to sensor_msgs/LaserScan messages on the scan topic. These scans are processed by the Projector and Transformer, which project the scan into Cartesian space and then transform it into the fixed_frame. This results in a sensor_msgs/PointCloud that can be added to the rolling buffer. Clouds in the rolling buffer are then assembled on service calls.</p>
</blockquote>
<figure class="center">
<img src="/images/ouster/point_cloud_assembler.png" alt="ROS point cloud assembler graph" />
<figcaption>Point Cloud Assembler data flow. Source: wiki.ros.org </figcaption>
</figure>
<p>To obtain this pointcloud I’ve added the following nodes to my launch file:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code> <node type="point_cloud2_assembler" pkg="laser_assembler" name="my_assembler">
<remap from="cloud" to="/os1_cloud_node/points"/>
<param name="max_clouds" type="int" value="400" />
<param name="fixed_frame" type="string" value="map" />
</node>
<node pkg="carto_demo" type="map_assembler.py" name="map_assembler"/>
</code></pre></div></div>
<p>where map_assembler.py is a simple Python script:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>#!/usr/bin/env python
# Periodically request the assembled cloud and republish it (Python 2, Melodic-era)
import rospy
from laser_assembler.srv import AssembleScans2
from sensor_msgs.msg import PointCloud2

rospy.init_node("map_assembler")
rospy.wait_for_service("assemble_scans2")
assemble_scans_srv = rospy.ServiceProxy('assemble_scans2', AssembleScans2)
pub = rospy.Publisher("/assembled_pointcloud", PointCloud2, queue_size=1)

r = rospy.Rate(1)
last_time = rospy.Time(0, 0)
while not rospy.is_shutdown():
    try:
        # assemble everything buffered since the previous request
        time_now = rospy.Time.now()
        resp = assemble_scans_srv(last_time, time_now)
        last_time = time_now
        pub.publish(resp.cloud)
    except rospy.ServiceException as e:
        print "Service call failed: %s" % e
    r.sleep()
</code></pre></div></div>
<p>Since I’d been saving all my data to a bag file, I was then able to create an assembled point cloud by running the <code class="language-plaintext highlighter-rouge">bag_to_pcd</code> node from <a href="http://wiki.ros.org/pcl_ros">pcl_ros</a> with the following syntax:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>rosrun pcl_ros bag_to_pcd <input_file.bag> <topic> <output_directory>
</code></pre></div></div>
<p>This command will create a .pcd file with the scan data that you can later import into software like CloudCompare:</p>
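To sanity-check the files dumped by bag_to_pcd without opening CloudCompare, you can read the point count straight from each file header. The sketch below assumes the standard PCD v0.7 plain-text header layout (key/value lines up to and including the `DATA` line); `read_pcd_header` is a hypothetical helper, not part of pcl_ros:

```python
def read_pcd_header(path):
    """Parse the header of a PCD file and return it as a dict.

    Assumes the standard PCD v0.7 header: plain-text lines such as
    'POINTS 38837' up to and including the 'DATA' line.
    """
    header = {}
    with open(path, "rb") as f:
        for raw in f:
            line = raw.decode("ascii", errors="replace").strip()
            if not line or line.startswith("#"):
                continue  # skip comments and blank lines
            key, _, value = line.partition(" ")
            header[key] = value
            if key == "DATA":  # the header ends at the DATA line
                break
    return header
```

Looping this over `*.pcd` in the output directory and printing `header["POINTS"]` gives a quick idea of how dense each assembled cloud is.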
<figure class="center">
<img src="/images/ouster/cloud_result.png" alt="Assembled Point Cloud" />
<figcaption>Assembled point cloud in CloudCompare</figcaption>
</figure>
<p>Looking at the point cloud you can notice quite a few outliers. Most of them are likely the result of me not tuning Cartographer for this post; however, the single line of measurements in the top left of the above image looks quite odd. Since it happens to be where a window in my apartment is, I wonder if it could be caused by light interference.</p>
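Outliers like these are usually cleaned up with a statistical outlier removal filter (PCL ships one). As a rough illustration of the idea only — not the PCL implementation — a brute-force numpy sketch could look like this:

```python
import numpy as np

def remove_outliers(points, k=8, std_ratio=2.0):
    """Statistical outlier removal, brute force (fine for small clouds).

    For each point, compute the mean distance to its k nearest
    neighbours; drop points whose mean distance is more than
    std_ratio standard deviations above the global mean.
    """
    # Pairwise distance matrix, shape (n, n); ignore self-distances
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    knn = np.sort(d, axis=1)[:, :k]      # k nearest neighbour distances
    mean_d = knn.mean(axis=1)
    keep = mean_d <= mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]
```

For real clouds you would use a KD-tree instead of the O(n²) distance matrix, which is exactly what PCL's `StatisticalOutlierRemoval` does.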
<h2 id="next-steps">Next steps</h2>
<p>I’ll be running further tests on this sensor as I work on the client’s project. If I come across any significant findings I’ll make sure to update this blog post. If you would like to know more, you can find two videos about Ouster on Steve Macenski’s <a href="https://www.youtube.com/channel/UCZT16dToD1ov6lnoEcPL6rw">Robots For Robots YouTube Channel</a>.</p>
<p><a href="https://msadowski.github.io/ouster-os1-ros-review/">Ouster OS1 - first impressions</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on June 26, 2020.</p>
Robotics ROS https://msadowski.github.io/pps-support-jetson-nano2020-04-28T00:00:00-00:002020-04-28T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>I believe that in everyone’s life there comes a time when they need to do what they’ve always feared: compile a Linux kernel. This post is a step-by-step guide to how I enabled PPS support on the Jetson Nano.</p>
<!-- more -->
<h2 id="background">Background</h2>
<p>For one of the projects I’m working on we wanted timing information as precise as possible on a Linux board. In this particular project I opted to use <a href="https://gpsd.gitlab.io/gpsd/gpsd-time-service-howto.html">gpsd</a> together with PPS to drive the clock of the Jetson Nano. During my research I ran into a couple of dead ends, so I decided to describe the process I followed in case anyone finds it helpful.</p>
<p>I used the following process on my Jetson Nano Developer Kit B01 with JetPack 4.3 (L4T release 32.3.1).</p>
<h2 id="useful-links">Useful links</h2>
<p>Here are the links that I’ve found most helpful while researching the steps to enable PPS on Jetson Nano:</p>
<ul>
<li><a href="https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%2520Linux%2520Driver%2520Package%2520Development%2520Guide%2Fkernel_custom.html%23">Jetson Kernel Customization</a> - should contain the most up to date process</li>
<li><a href="https://developer.ridgerun.com/wiki/index.php?title=Jetson_Nano/Development/Building_the_Kernel_from_Source">Ridgerun wiki</a> - this guide might be slightly outdated but it helped me fill in the gaps</li>
<li><a href="https://forums.developer.nvidia.com/t/pps-on-jetson-nano/75841/4">PPS on Jetson Nano</a> - a topic from Nvidia Developer Forum</li>
<li><a href="https://forums.developer.nvidia.com/t/enabling-pps-on-jetson-nano-with-jetpack-4-3/119418/10">Enabling PPS on Jetson Nano with Jetpack 4.3</a> - a topic I started on Nvidia Developer Forum asking for support in enabling the PPS support</li>
</ul>
<h2 id="tools-setup">Tools setup</h2>
<p>The paths I’ll be showing here are in the format they appear on my machine (Ubuntu 18.04), where my home directory is <code class="language-plaintext highlighter-rouge">/home/mat</code>. If you are following this tutorial make sure that you change the paths to match your setup.</p>
<ul>
<li>Install the toolchain following <a href="https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%20Linux%20Driver%20Package%20Development%20Guide/xavier_toolchain.html">these instructions</a></li>
<li>Download and install the <a href="https://developer.nvidia.com/nvidia-sdk-manager">SDK manager</a> (I used version 1.0.1.5538 in this tutorial). After it’s downloaded, launch it and set up the development environment. In my case it looked like this:</li>
</ul>
<figure class="center">
<img src="/images/jetson_pps/sdk_manager.png" alt="SDK manager" />
<figcaption>SDK manager</figcaption>
</figure>
<p>Make sure you select the Target Hardware to match your board. In my case it is <code class="language-plaintext highlighter-rouge">Jetson Nano (Developer Kit version) (P3448)</code>.</p>
<ul>
<li>In STEP 02 of the SDK I used the default settings and set the Target HW image folder as <code class="language-plaintext highlighter-rouge">/home/mat/nvidia/nvidia_sdk</code>. Then I pressed continue to download all the packages.</li>
</ul>
<h2 id="configuring-the-kernel">Configuring the kernel</h2>
<p>The steps I performed should match the instructions on the Jetson Kernel Customization page linked above, but I’d advise you to cross-check.</p>
<ul>
<li>Navigate to the folder and create the output kernel directory:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>cd ~
cd nvidia
mkdir kernel_compiled
</code></pre></div></div>
<ul>
<li>Export variables:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>export CROSS_COMPILE=$HOME/l4t-gcc/gcc-linaro-7.3.1-2018.05-x86_64_aarch64-linux-gnu/bin/aarch64-linux-gnu-
export TEGRA_KERNEL_OUT=/home/mat/nvidia/kernel_compiled
export LOCALVERSION=-tegra
</code></pre></div></div>
<ul>
<li>Sync the kernel repository by running <code class="language-plaintext highlighter-rouge">source_sync.sh</code> in <code class="language-plaintext highlighter-rouge">/nvidia/nvidia_sdk/JetPack_4.3_Linux_P3448/Linux_for_Tegra</code>. I synced to the tag <code class="language-plaintext highlighter-rouge">tegra-l4t-r32.3.1</code></li>
<li>Build the kernel source configuration:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>cd ~/nvidia/nvidia_sdk/JetPack_4.3_Linux_P3448/Linux_for_Tegra/sources/kernel/kernel-4.9
mkdir -p $TEGRA_KERNEL_OUT
make ARCH=arm64 O=$TEGRA_KERNEL_OUT tegra_defconfig
</code></pre></div></div>
<ul>
<li>Add PPS client support. In your editor of choice open the file <code class="language-plaintext highlighter-rouge">.config</code> in the <code class="language-plaintext highlighter-rouge">$TEGRA_KERNEL_OUT</code> directory (<code class="language-plaintext highlighter-rouge">cd $TEGRA_KERNEL_OUT</code>) and make sure PPS is enabled as follows:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>#
# PPS support
#
CONFIG_PPS=y
# CONFIG_PPS_DEBUG is not set
#
# PPS clients support
#
CONFIG_PPS_CLIENT_KTIMER=y
CONFIG_PPS_CLIENT_LDISC=y
CONFIG_PPS_CLIENT_GPIO=y
</code></pre></div></div>
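If you'd rather verify the result programmatically than eyeball the file, a tiny checker can confirm all the required PPS options are set. This is a hypothetical helper of my own, not part of the kernel build tooling:

```python
import re

# The PPS options we expect to find enabled in .config
REQUIRED = {
    "CONFIG_PPS": "y",
    "CONFIG_PPS_CLIENT_KTIMER": "y",
    "CONFIG_PPS_CLIENT_LDISC": "y",
    "CONFIG_PPS_CLIENT_GPIO": "y",
}

def check_kernel_config(path):
    """Return the list of required PPS options missing from a .config file."""
    opts = {}
    with open(path) as f:
        for line in f:
            # Enabled options look like 'CONFIG_FOO=y'; comments are skipped
            m = re.match(r"(CONFIG_\w+)=(\S+)", line)
            if m:
                opts[m.group(1)] = m.group(2)
    return [k for k, v in REQUIRED.items() if opts.get(k) != v]
```

Running it against `$TEGRA_KERNEL_OUT/.config` should return an empty list before you proceed to the build.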
<ul>
<li>Add PPS GPIO support. In <code class="language-plaintext highlighter-rouge">~/nvidia/nvidia_sdk/JetPack_4.3_Linux_P3448/Linux_for_Tegra/sources/hardware/nvidia/soc/t210/kernel-dts/tegra210-soc</code> edit <code class="language-plaintext highlighter-rouge">tegra210-soc-base.dtsi</code> and add the following lines:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>pps {
gpios = <&gpio TEGRA_GPIO(B, 7) 0>;
compatible = "pps-gpio";
status = "okay";
};
</code></pre></div></div>
<h2 id="building-and-flashing-the-kernel">Building and flashing the kernel</h2>
<ul>
<li>Build the kernel:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>cd ~/nvidia/nvidia_sdk/JetPack_4.3_Linux_P3448/Linux_for_Tegra/sources/kernel/kernel-4.9
make ARCH=arm64 O=$TEGRA_KERNEL_OUT -j8
</code></pre></div></div>
<ul>
<li>
<p>Prepare the board for flashing. In case of board B01: make sure the board is powered off, short pins SYS_RST and GND to enter recovery mode, power up the board from the power supply and connect the board to the USB port of your machine</p>
</li>
<li>
<p>Flash kernel</p>
</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>cd ~/nvidia/nvidia_sdk/JetPack_4.3_Linux_P3448/Linux_for_Tegra/kernel
cp $TEGRA_KERNEL_OUT/arch/arm64/boot/Image Image
cd dtb
cp -a $TEGRA_KERNEL_OUT/arch/arm64/boot/dts/. .
cd ~/nvidia/nvidia_sdk/JetPack_4.3_Linux_P3448/Linux_for_Tegra
sudo ./flash.sh jetson-nano-qspi-sd mmcblk0p1
</code></pre></div></div>
<ul>
<li>After the previous step you should see the board flashing. After the flashing is done you will be able to set up the system using a computer screen, keyboard etc.</li>
</ul>
<h2 id="testing-pps">Testing PPS</h2>
<ul>
<li>Make sure <code class="language-plaintext highlighter-rouge">ls /dev/pps* </code> returns <code class="language-plaintext highlighter-rouge">/dev/pps0</code> and <code class="language-plaintext highlighter-rouge">/dev/pps1</code></li>
<li>Install pps-tools <code class="language-plaintext highlighter-rouge">sudo apt-get install pps-tools</code></li>
<li>Run a PPS test on /dev/pps0: <code class="language-plaintext highlighter-rouge">sudo ppstest /dev/pps0</code>. You should see output like the following even with no PPS device connected:</li>
</ul>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>trying PPS source "/dev/pps0"
found PPS source "/dev/pps0"
ok, found 1 source(s), now start fetching data...
source 0 - assert 1588064315.861849703, sequence: 4355 - clear 0.000000000, sequence: 0
source 0 - assert 1588064316.885912540, sequence: 4356 - clear 0.000000000, sequence: 0
source 0 - assert 1588064317.909913941, sequence: 4357 - clear 0.000000000, sequence: 0
source 0 - assert 1588064318.933846094, sequence: 4358 - clear 0.000000000, sequence: 0
source 0 - assert 1588064319.957913486, sequence: 4359 - clear 0.000000000, sequence: 0
...
</code></pre></div></div>
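Note that the assert timestamps above are spaced roughly 1.024 s apart rather than exactly one second. To check pulse-interval stability from a captured ppstest log, you could parse it like this (the line format is assumed from the sample above; `assert_intervals` is my own illustrative helper):

```python
import re

def assert_intervals(log_lines):
    """Extract PPS assert timestamps from ppstest output lines and
    return the intervals between consecutive pulses, in seconds."""
    times = []
    for line in log_lines:
        # e.g. 'source 0 - assert 1588064315.861849703, sequence: 4355 ...'
        m = re.search(r"assert (\d+)\.(\d+)", line)
        if m:
            times.append(int(m.group(1)) + int(m.group(2)) / 1e9)
    return [b - a for a, b in zip(times, times[1:])]
```

Feeding it the captured log lines lets you quickly spot jitter or missed pulses once a real PPS source is wired up.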
<ul>
<li>Now, to test the /dev/pps1 device you will need to connect your PPS source to pin 18 of the 40-pin header of the Jetson Nano (connecting the ground pins between the PPS source and the Jetson won’t hurt either). Running ppstest again on /dev/pps1 should produce output similar to what we saw for /dev/pps0</li>
</ul>
<figure class="center">
<img src="/images/jetson_pps/jetson_ublox.jpg" alt="Jetson and Ublox" />
<figcaption><a href="https://www.ardusimple.com/">ArduSimple simpleRTK2B</a> and Jetson Nano, almost ready for PPS triggering</figcaption>
</figure>
<p>That should be it! The setup you can see in the picture above worked well for the PPS input after connecting the antennas and powering everything up. Now, with some NMEA sentences being sent from the simpleRTK2B to the Jetson and some gpsd and chrony configuration, I should be able to discipline the Jetson’s clock.</p>
<p>I hope these steps save you a bit of time getting PPS working, especially if, like me, this was your first time building a kernel.</p>
<figure class="center">
<img src="https://imgs.xkcd.com/comics/cautionary.png" alt="Cautionary XKCD strip" />
<figcaption>Source: <a href="https://imgs.xkcd.com/comics/cautionary.png">XKCD</a></figcaption>
</figure>
<p><a href="https://msadowski.github.io/pps-support-jetson-nano/">Enabling PPS on Jetson Nano</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on April 28, 2020.</p>
Robotics https://msadowski.github.io/robosynthesis-3d-camera2020-04-23T00:00:00-00:002020-04-23T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>Together with <a href="https://www.robosynthesis.com/">Robosynthesis</a> we’ve been working on developing and integrating a 360 camera module for our ROS based industrial inspection robots. I was so pleased with the results that I couldn’t pass on the opportunity to share what we’ve been up to.</p>
<!-- more -->
<h2 id="background">Background</h2>
<p>In one of the projects we worked on we ran into an issue - teleoperating a robot in an environment filled with obstacles was really difficult with the cameras we had at the time. In that particular project we ended up with the following camera setup:</p>
<ul>
<li>One camera was pointed forward</li>
<li>Two cameras were pointing to the sides</li>
<li>The fourth camera was attached at the top of the robot looking down, allowing the operator to get a good view of what’s next to the robot’s wheels</li>
</ul>
<p>The user interface for this setup looked something like this:</p>
<figure class="center">
<img src="/images/360_camera/ui.png" alt="First iteration UI" />
<figcaption>Not the actual interface</figcaption>
</figure>
<p>Each of the boxes above provided the user with a camera view. This worked well enough but wasn’t very intuitive. Months later we started exploring what we call <a href="https://www.robosynthesis.com/post/helicopter-360-degree-rgb-camera-view-for-robotic-teleops">a helicopter 360 view</a>.</p>
<figure class="center">
<img src="/images/360_camera/helicopter.png" alt="First shot at helicopter view" />
<figcaption>4 rectified camera images almost ready to be stitched together</figcaption>
</figure>
<p>We managed to come up with a solution that rectified images from four cameras simultaneously. The idea at the time was that with a high enough field of view we should be able to create a nice top-down projection of the environment around the robot. There is definitely value in a top-down projection around the robot, but why constrain yourself to a 2D view when you can go full 360?</p>
<h2 id="360-camera-ros-module-in-action">360 camera ROS module in action</h2>
<p>Before we jump into the technical details of the solution let me show you how it works:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/MlQfebtZFV0" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
<p>In this video I use my mouse to look around the 360 sphere with the camera feeds overlaid. The interface is very intuitive - you just click and drag the mouse to look around the environment and use the scroll wheel to zoom in and out. There are a couple of things I like about this solution:</p>
<ul>
<li>Coverage - compared to the alternative solutions we are looking everywhere around the robot at the same time. This is great for inspection because we can record the data and play it back later on allowing us to focus on many points of interest</li>
<li>Ease of use - this solution is way more intuitive to the operator than our previous 4 separate camera views</li>
<li>Manageable latency - you can actually operate a robot using this view as the only camera feed</li>
</ul>
<p>The first step towards our solution was sourcing USB cameras with a high field of view. We ended up with 220-degree FoV USB cameras. The first test was getting the image feed from a single camera:</p>
<figure class="center">
<img src="/images/360_camera/first_test.png" alt="First image captured with the high FoV camera" />
<figcaption>Hello World! (That's a proper fisheye view, isn't it?)</figcaption>
</figure>
<p>Then we entered a very rapid prototyping stage with this <em>handy</em> prototype:</p>
<figure class="center">
<img src="/images/360_camera/first_prototype.jpg" alt="First prototype" />
<figcaption>It took about 5 minutes to assemble this one</figcaption>
</figure>
<p>This first prototype allowed me to quickly prove the concept and prepare and test the software stack. In less than a week from the first proof of concept the team took it to the next level:</p>
<figure class="center">
<img src="/images/360_camera/final_camera.jpg" alt="360 Camera mounted on the robot" />
<figcaption>Final result mounted on the robot</figcaption>
</figure>
<p>In this short period of time we have:</p>
<ul>
<li>Designed the casing</li>
<li>Added a flood light that can be triggered with a ROS service call</li>
<li>Made it into a module that can be plugged in anywhere on the robot’s deck</li>
</ul>
<figure class="center">
<img src="/images/360_camera/flood_light.jpg" alt="Floodlight module in action" />
<figcaption>Let there be light</figcaption>
</figure>
<figure class="center">
<img src="/images/360_camera/flood_light.gif" alt="Animation showing the flood light module" />
<figcaption>Where there is robot there is light</figcaption>
</figure>
<h2 id="technical-details">Technical details</h2>
<p>I hope that at this point you are wondering how we create this 360 view. The core piece of software we used is the <a href="https://github.com/UTNuclearRoboticsPublic/rviz_textured_sphere">RVIZ Textured Sphere</a> open-source plugin developed by researchers from <a href="https://robotics.me.utexas.edu/">the Nuclear and Applied Robotics Group</a> at the University of Texas. The plugin takes the image feed from two sources and applies them onto a 3D sphere.</p>
<figure class="center">
<img src="/images/360_camera/rviz_textured_sphere_demo.gif" alt="Nuclear Robotics rviz textured sphere demo animation" />
<figcaption>The original rviz_textured_sphere demo. Credit: Nuclear and Applied Robotics Group</figcaption>
</figure>
<p>Since the first prototype used USB cameras, we relied on standard UVC drivers for ROS. Unfortunately, the cameras we used were suboptimal, delivering a rectangular image around a circular fisheye view. Because of that we had to post-process the images to be square, with the fisheye view centered.</p>
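That centering step can be sketched with numpy. This is a minimal illustration of the idea, assuming the fisheye circle's centre and radius are known (e.g. from calibration); the function name and interface are illustrative, not our production code:

```python
import numpy as np

def center_fisheye(img, circle_center, radius):
    """Crop a rectangular camera frame to a square with the fisheye
    circle centred. circle_center is (x, y) in pixels; areas outside
    the original frame are padded with black."""
    h, w = img.shape[:2]
    cx, cy = circle_center
    # Output square of side 2*radius, same dtype and channel count
    out = np.zeros((2 * radius, 2 * radius) + img.shape[2:], dtype=img.dtype)
    x0, y0 = cx - radius, cy - radius          # top-left of the crop window
    xs, ys = max(0, x0), max(0, y0)            # clamp to the source frame
    xe, ye = min(w, cx + radius), min(h, cy + radius)
    out[ys - y0:ye - y0, xs - x0:xe - x0] = img[ys:ye, xs:xe]
    return out
```

The resulting square frames are what a sphere-projection plugin expects as input.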
<p>Since we want the 360 camera view to be used by the operator, a hard requirement for us is latency below 1 second. In this first prototype we saw around 500 ms of latency at 800x600 px resolution for a single camera. In the next iteration I’d like to drive the latency down below 300 ms while increasing the camera resolution. Stay tuned for future updates!</p>
<p>As a bonus, the picture below shows two <a href="https://www.robosynthesis.com/">Robosynthesis</a> robots with heaps of modules, ready for industrial inspection.</p>
<figure class="center">
<img src="/images/360_camera/robosynthesis_robots.jpg" alt="Robosynthesis Robots" />
<figcaption>Modular robots, half of them carrying our 360 camera module</figcaption>
</figure>
<p><a href="https://msadowski.github.io/robosynthesis-3d-camera/">360 camera for industrial inspection with ROS and Robosynthesis robots</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on April 23, 2020.</p>
Robotics ROS https://msadowski.github.io/ardusimple-ros-integration2020-04-05T00:00:00-00:002020-04-05T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>As you might have seen from two of my previous posts (<a href="https://msadowski.github.io/ardusimple-rtk-first-impressions/">1</a>, <a href="https://msadowski.github.io/ardusimple-rtk-moving-forward/">2</a>) I have been doing some testing with ublox F9P based ArduSimple RTK setup. In this blog post I’m describing how I integrated these modules with Robot Operating System (ROS).</p>
<!-- more -->
<h2 id="introduction">Introduction</h2>
<p>Please see my <a href="https://msadowski.github.io/ardusimple-rtk-first-impressions/">first post</a> in this series to learn more about my <a href="https://www.ardusimple.com/">ArduSimple setup</a>; the short story is that I use a simpleRTK2B for my base station and a simpleRTK2B+heading for the rover.</p>
<figure class="center">
<img src="/images/ublox/ardusimple_base_rover.jpg" alt="ArduSimple base and rover" />
<figcaption>ArduSimple RTK base (left) and rover (right)</figcaption>
</figure>
<h2 id="configuration">Configuration</h2>
<p>After I managed to break the whole configuration by overwriting it through the ublox ROS node with a wrong config, I had to fire up the u-center application. From the <a href="https://github.com/ardusimple/simpleRTK2B/tree/master/Configuration_Files">ArduSimple GitHub repo</a> I downloaded the following configuration files:</p>
<ul>
<li>srtk2b_base_FW_HPG112.txt - base module</li>
<li>srtk2b+heading_lite_movingbase_FW_HPG112.txt - the heading lite module (the small board), flashed by connecting to the XBEE USB</li>
<li>srtk2b+heading_rover_F9P_FW_HPG112.txt - the main rover module</li>
</ul>
<p>To flash the config files I’ve followed <a href="https://www.ardusimple.com/configuration-files/">instructions on ArduSimple website</a>. When writing the configuration file pay attention to the firmware version and if needed <a href="https://www.ardusimple.com/zed-f9p-firmware-update-with-simplertk2b/">follow these steps to update your modules</a>.</p>
<p>When I flashed the ublox configuration following the above guides, the checkbox for storing the configuration was disabled. My assumption at the time was that the F9P would automatically store the settings in memory; it turned out I was very wrong about that, and I noticed it only after switching back and forth between Windows and Linux numerous times.</p>
<p>To persist the loaded config, go to View-&gt;Configuration View-&gt;CFG, select all devices, and press the Send button at the bottom of the window.</p>
<figure class="center">
<img src="/images/ublox/config_save.png" alt="u-blox F9P configuration window" />
<figcaption>Saving u-blox F9P configuration</figcaption>
</figure>
<h2 id="u-blox-ros-driver">U-blox ROS driver</h2>
<p>When searching for the ROS driver the most important feature I was looking for was the support of UBX-NAV-RELPOSNED messages that would provide the relative heading between the two rover antennas.</p>
<p>My ROS package of choice supporting the F9P became <a href="http://wiki.ros.org/ublox">ublox</a> by the <a href="https://www.kumarrobotics.org/">Vijay Kumar Lab</a>. As I noted in the previous section, using this package I overwrote the modules’ configuration by running some of the example launch files in the repository. It took me a while to realise the modules were misconfigured, but when I did I made sure to set <code class="language-plaintext highlighter-rouge">config_on_startup: false</code> in the .yaml file for my rover config. The yaml file I used in my experiments is below:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>debug: 1 # Range 0-4 (0 means no debug statements will print)
device: /dev/ttyACM0
frame_id: gps
config_on_startup: false
uart1:
baudrate: 115200
rate: 1
nav_rate: 1
publish:
all: true
</code></pre></div></div>
<p>The ArduSimple launch file I used looked as follows:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code><launch>
<arg name="node_name" default="gps"/>
<arg name="param_path" default="$(find ardusimple_rover)/config/ardusimple.yaml" />
<arg name="output" default="screen" />
<arg name="respawn" default="true" />
<arg name="respawn_delay" default="30" />
<arg name="clear_params" default="true" />
<node pkg="ublox_gps" type="ublox_gps" name="$(arg node_name)"
output="$(arg output)"
clear_params="$(arg clear_params)"
respawn="$(arg respawn)"
respawn_delay="$(arg respawn_delay)">
<rosparam command="load" file="$(arg param_path)" />
</node>
</launch>
</code></pre></div></div>
<p>After launching the above file I could echo the <code class="language-plaintext highlighter-rouge">/gps/fix</code> topic for GPS messages. When you do that in an RTK setup, pay attention to the covariance fields: if they are very low then you most likely have a great view of the sky and the base unit is providing good corrections. To see the relative heading between the two modules you can check the <code class="language-plaintext highlighter-rouge">/gps/navheading</code> topic.</p>
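To make those covariance fields more intuitive, you can convert the diagonal of the position covariance into standard deviations in metres. The sketch below assumes the standard sensor_msgs/NavSatFix layout (a 9-element row-major 3x3 matrix in m², east/north/up order); `horizontal_std_dev` is an illustrative helper of my own:

```python
import math

def horizontal_std_dev(position_covariance):
    """Given the 9-element row-major position_covariance from a
    sensor_msgs/NavSatFix message (m^2, ENU order), return the
    east and north standard deviations in metres."""
    east_var = position_covariance[0]    # element (0, 0)
    north_var = position_covariance[4]   # element (1, 1)
    return math.sqrt(east_var), math.sqrt(north_var)
```

With a good RTK fix you would expect these values in the centimetre range; metre-level values suggest the corrections aren't getting through.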
<h2 id="robot-localization">Robot Localization</h2>
<p>In this blog post I wanted to provide the simplest example that you could run just with the RTK hardware and no other sensors. As a test I’ve created a basic setup for robot_localization and navsat_transform_node to create a working TF structure. The only data I was fusing in the EKF was the GPS fix message and the relative heading between the modules.</p>
<p>Because I only have information from GPS, this setup does not adhere to <a href="https://www.ros.org/reps/rep-0105.html">REP-105</a> - we don’t have any continuous source of odometry. That’s why you usually use GPS as one of the inputs to the EKF providing the map-&gt;odom transform (more on this in the last section).</p>
<p>There are two nodes that I used here:</p>
<ul>
<li>navsat_transform_node - to produce an odometry message from gps fix</li>
<li>ekf_localization_node - to create a transform between the odom and gps frames</li>
</ul>
<p>You will find all the configuration files that I’ve created in the <a href="https://github.com/msadowski/ardusimple_rover">ardusimple_rover repository</a>. There are two things to note here: because the observed rotation between the modules (the <code class="language-plaintext highlighter-rouge">/gps/navheading</code> topic) is relative, navsat_transform_node will have an error in orientation. Normally you would use an IMU instead, which should produce correct results. The only two topics fused in the EKF are <code class="language-plaintext highlighter-rouge">/gps/fix</code> and <code class="language-plaintext highlighter-rouge">/gps/navheading</code> - in a real setup you will want to fuse information from your other sensors too.</p>
<h2 id="results">Results</h2>
<p>Due to the lockdown I was only able to perform a test in a residential area, driving a car with the two rover antennas attached to the roof. I placed the base antenna in a fixed position and let it run for a couple of minutes. After manually adjusting the <code class="language-plaintext highlighter-rouge">yaw_offset</code> parameter of navsat_transform_node to align the odometry with the true GPS coordinates, I was able to get the following result in <a href="http://wiki.ros.org/mapviz">mapviz</a>:</p>
<figure class="center">
<img src="/images/ublox/mapviz.gif" alt="mapviz gif" />
<figcaption>Mapviz + U-blox ArduSimple in action (4x)</figcaption>
</figure>
<p>Above you can see the following information:</p>
<ul>
<li>blue dots - raw GPS information</li>
<li>green line - filtered odometry</li>
<li>yellow points - location of gps frame in odom frame</li>
</ul>
<p>In the graph below you can see horizontal gps position covariance (top) and the orientation covariance from F9P. As the car moved from an area with a wide sky view to a more urban area you can see an increase in covariance values.</p>
<figure class="center">
<img src="/images/ublox/covariance_graphs.png" alt="mapviz gif" />
<figcaption>Covariance values for the whole test run, courtesy of PlotJuggler</figcaption>
</figure>
<h2 id="next-steps">Next steps</h2>
<p>The setup I described together with the <a href="https://github.com/msadowski/ardusimple_rover">demo repository</a> should get you started on integrating and testing ArduSimple RTK boards (probably anything based on U-blox F9P too). If you wanted to integrate this solution on a mobile robot then you would run two instances of robot_localization:</p>
<ul>
<li>ekf map->odom - fusing information from all of your sources, including RTK + heading from RTK</li>
<li>ekf odom->base_link - fusing only continuous sources of odometry (wheel odometry, IMU)</li>
</ul>
<p>For more information on how to pull this off you can see the <a href="http://docs.ros.org/melodic/api/robot_localization/html/integrating_gps.html">robot_localization wiki</a>.</p>
<p><a href="https://msadowski.github.io/ardusimple-ros-integration/">ArduSimple RTK - ROS integration</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on April 05, 2020.</p>
Robotics ROS https://msadowski.github.io/ardusimple-rtk-moving-forward2020-02-06T00:00:00-00:002020-02-06T00:00:00+01:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>In my <a href="https://msadowski.github.io/ardusimple-rtk-first-impressions/">previous post</a> I was sharing my first impressions from using <a href="https://www.ardusimple.com/">ArduSimple RTK GPS</a>. It’s time to share some information about the small but steady progress I had made since!</p>
<!-- more -->
<h2 id="introduction">Introduction</h2>
<p>In the last post I tested the RTK in a very static manner by putting the GPS antennas next to the wall of a building. I’m not a fan of sticking electronic boards straight onto rough, unprotected surfaces. My first thought on securing the electronics was to 3D print <a href="https://www.thingiverse.com/thing:3551604">this case</a>. The issue with this case is that it’s not tall enough to fit the rover module, which has an extra GPS board that sits below the XBee radio.</p>
<figure class="center">
<img src="/images/ublox/ardusimple_base_rover.jpg" alt="ArduSimple base and rover" />
<figcaption>ArduSimple RTK base (left) and rover (right)</figcaption>
</figure>
<p>My next thought was that I could perhaps design my own case, but instead I decided to design a base plate that can hold the module. If it’s of any use to you and you would like to 3D print it, you can grab it <a href="https://github.com/msadowski/msadowski.github.io/blob/master/files/ardusimple_plate.stl">here</a>. You will just need to secure the modules to the plate using 3 M3x8 screws. I recommend choosing screws with as small a head as possible - the screw that goes right below the XBee module is very close to the pin headers.</p>
<figure class="center">
<img src="/images/ublox/base_plate.jpg" alt="3D printed base plate for ArduSimple RTK modules" />
<figcaption>3D printed base plate for ArduSimple RTK modules</figcaption>
</figure>
<p>Unfortunately the base plate won’t help with any environmental protection therefore behold…</p>
<h2 id="project-boxes">Project Boxes</h2>
<figure class="center">
<img src="/images/ublox/boxes_0.jpg" alt="Shiny new project boxes" />
<figcaption>Shiny new project boxes</figcaption>
</figure>
<p>I figured that purchasing two project boxes would be way faster than designing a case from scratch: I would just drill two holes on every side of each box to run the necessary wires through and be done in no time. Also, these project boxes are (were) IP66 rated.</p>
<figure class="center">
<img src="/images/ublox/boxes_1.jpg" alt="Project boxes with holes" />
<figcaption>Almost done here (don't think they are IP66 anymore though)</figcaption>
</figure>
<p>After cutting the holes in the boxes it was time to place the RTK modules inside:</p>
<figure class="center">
<img src="/images/ublox/boxes_2.jpg" alt="Projext boxes with RTK modules inside" />
</figure>
<p>Do you see any issues here? Me too! Most of the cables are not as flexible as I had hoped and there is simply no way to access the USB ports on the left side of the modules (the ones I was actually planning to use).</p>
<figure class="center">
<img src="/images/ublox/boxes_3.jpg" alt="Project boxes with RTK modules and antennas attached" />
</figure>
<p>Here are the boxes with the antennas attached. You can clearly see that there is no way to get to the USB ports in this configuration. Well… sometimes you’ve got to do what you’ve got to do…</p>
<figure class="center">
<img src="/images/ublox/boxes_4.jpg" alt="Completely butchered project boxes" />
<figcaption>From now on you can call me project box butcher</figcaption>
</figure>
<p>At this point I’m pretty sure designing and printing my own case would have been the faster option (mostly due to my lack of proper tools). The good news is this should do the job. If it starts raining in the field, the covered box should give me enough time to extract the hardware without destroying it.</p>
<h2 id="lessons-learned">Lessons learned</h2>
<p>Doing a bit of manual hardware work was a good break from all the software I’ve been doing in the past couple of months. Here are some things that I’ve learned and hopefully they are useful for you too:</p>
<ul>
<li>Designing a base plate before I had the boxes - in hindsight I should’ve waited for the boxes to arrive before designing the base plate. That way I would have had more control over the position of the modules within the box and could have cut much more precise (and smaller) openings in the boxes</li>
<li>I should’ve ordered a box with a transparent lid - that way I could see the RTK status lights without needing to open up the box</li>
<li>When manipulating the XBee antenna the XBee module sometimes lifts up - ideally the antenna shouldn’t have any ‘wiggle room’</li>
</ul>
<h2 id="next-steps">Next steps</h2>
<p>I’m happy enough with the state of my project boxes to start running some experiments. Since the ArduSimple RTK modules are based on the Ublox ZED-F9P I’m going to start with the ROS <a href="https://github.com/KumarRobotics/ublox">ublox package</a>. I just need to set up a project box for my Raspberry Pi or… not this time!</p>
<figure class="center">
<img src="/images/ublox/pi.jpg" alt="Raspberry Pi in a case" />
</figure>
<p><a href="https://msadowski.github.io/ardusimple-rtk-moving-forward/">ArduSimple RTK - preparing for testing</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on February 06, 2020.</p>
Robotics https://msadowski.github.io/ardusimple-rtk-first-impressions2020-01-06T00:00:00-00:002020-01-06T00:00:00+01:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p><a href="https://www.ardusimple.com/">ArduSimple</a> provided me with their <a href="https://www.ardusimple.com/product/simplertk2b-heading-basic-starter-kit-ip67/">RTK2B+heading module</a> recently. In this blog post I describe my experience running the first tests with these modules and my planned further tests. My end goal is to integrate the modules with ROS and get precise 2D positioning with extra heading information.</p>
<!-- more -->
<h2 id="rtk-basics">RTK basics</h2>
<p>They say that a picture is worth a thousand words, so here is a 5 minute YouTube video that should give you basic information on RTK in case you haven’t heard of it before:</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/R0Hry5kR1jY" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>What is special about the module I’m testing from ArduSimple is the rover board, which comes with 2 GPS chips, allowing you to plug in two GPS antennas and determine the rover heading from the available information.</p>
<h2 id="setup">Setup</h2>
<p>The setup I’ve received from ArduSimple consists of the following:</p>
<ul>
<li><a href="https://www.ardusimple.com/product/simplertk2b-heading-basic-starter-kit-ip67/">simpleRTK2B+heading</a> - rover module</li>
<li><a href="https://www.ardusimple.com/product/simplertk2b-basic-starter-kit-ip65/">simpleRTK2B</a> - base station module</li>
<li>Long range radio module x2</li>
<li>3x <a href="https://www.ardusimple.com/product/ann-mb-00-ip67/">u-blox ANN-MB-00</a> GNSS multiband antennas</li>
</ul>
<figure class="center">
<img src="/images/ublox/ardusimple_base_rover.jpg" alt="ArduSimple base and rover modules" />
<figcaption>ArduSimple base module (left) and rover module (right)</figcaption>
</figure>
<p>The ArduSimple boards are based on the <a href="https://www.u-blox.com/en/product/zed-f9p-module">u-blox ZED-F9P</a>, a professional-grade multiband GNSS module.</p>
<p>The long range radio modules that I’ve received are XBee SX 868 - 863-870 MHz radio modules boasting a line-of-sight range of 14 km.</p>
<figure class="center">
<img src="/images/ublox/u-blox_antenna.jpg" alt="u-blox antenna" />
<figcaption>u-blox ANN-MB-00 antenna</figcaption>
</figure>
<p>The u-blox antennas are IP67 rated, have two mounting holes, and include an embedded magnet that makes them convenient to attach to metal surfaces.</p>
<h2 id="testing">Testing</h2>
<p>One piece of advice from my drone days that stuck with me is to always plug in all antennas before powering the system. That’s what I did when working with this system and I recommend you do the same.</p>
<figure class="center">
<img src="/images/ublox/u_center.png" alt="u-center software" />
<figcaption>running u-center software</figcaption>
</figure>
<p>At this point I have only run some static tests. The first test I performed using the <a href="https://www.u-blox.com/en/product/u-center">u-center software</a>: even with the antennas hanging outside my window in an urban space (conditions far from ideal due to <a href="https://www.e-education.psu.edu/geog862/node/1721">multipath</a>) the software was reporting 0.23 m accuracy in 2D - not bad at all!</p>
<figure class="center">
<img src="/images/ublox/google_earth.png" alt="GPS position being shown in Google Earth" />
<figcaption>Streaming GPS information to Google Earth</figcaption>
</figure>
<p>Thanks to the advice from the ArduSimple team I was able to display the position on Google Earth (to do that select Menu Bar > File > Database Export > Google Earth server in u-center).</p>
<p>As another experiment I tested the receivers with the <a href="https://github.com/ros-agriculture/ublox_f9p">ROS-Agriculture ublox_f9p node</a>. Amazingly, it worked out of the box, streaming out the position data.</p>
<p>After I performed the tests I found out about the <a href="https://github.com/KumarRobotics/ublox">ublox package</a> that seems to implement the RELPOSNED message and hence should allow me to stream heading from my rover setup.</p>
<h2 id="summary-and-next-steps">Summary and next steps</h2>
<p>I found the ArduSimple units to be a great piece of kit. I still have quite a bit of work to do to implement them on a robotic platform; here is a rough plan for how I’m going to pull this off:</p>
<ol>
<li>Build cases and a base station setup (ideally so that it can be placed on a tripod)</li>
<li>Test out the <a href="https://github.com/KumarRobotics/ublox">ublox package</a></li>
<li>If the package from the previous point is not satisfactory: fork the ROS-Agriculture ublox_f9p repository and make sure the UBX-RELPOSNED message is handled and the driver publishes heading</li>
<li>Capture data by walking around with the module in hand</li>
<li>Integrate the module on my mobile robot development platform and tune the EKF to work with the RTK module</li>
<li>Enable corrections via <a href="https://www.agsgis.com/What-is-NTRIP_b_42.html">NTRIP</a></li>
</ol>
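<p>Step 3 above boils down to very little math: the UBX-RELPOSNED message carries the relative position of the rover antenna with respect to the base in a north/east/down frame, so the heading of the baseline is just an <code class="language-plaintext highlighter-rouge">atan2</code> over the horizontal components. Here is a rough sketch of that computation - the function name is mine, and it assumes the antenna baseline is aligned with the vehicle’s forward axis:</p>

```python
import math

def heading_from_relposned(rel_north_m, rel_east_m):
    """Heading of the base->rover baseline in degrees,
    measured clockwise from true north (0-360).

    rel_north_m / rel_east_m are the horizontal components of the
    relative position reported by UBX-RELPOSNED, in metres.
    """
    heading = math.degrees(math.atan2(rel_east_m, rel_north_m))
    return heading % 360.0

# Baseline pointing due east -> 90.0
print(heading_from_relposned(0.0, 1.0))
```

<p>A real driver would also check the RELPOSNED validity flags before trusting the result - with a short baseline the heading gets noisy quickly.</p>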
<p><a href="https://msadowski.github.io/ardusimple-rtk-first-impressions/">ArduSimple RTK2B+heading - first impressions</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on January 06, 2020.</p>
ROS Robotics https://msadowski.github.io/iris-lama-slam-with-ros2019-11-29T00:00:00-00:002019-11-29T00:00:00+01:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>If I knew how to draw I would create a header for this blog post that would show a lama getting vaccinated. Since you won’t be seeing it any time soon, let’s jump into <a href="https://github.com/iris-ua/iris_lama">iris_LaMa</a>, a localization and mapping library that works with ROS.</p>
<!-- more -->
<h2 id="setup">Setup</h2>
<figure class="center">
<img src="https://msadowski.github.io/images/dev_platform_2.jpg" alt="Mobile robot" />
<figcaption>Robosynthesis development platform</figcaption>
</figure>
<p>Similarly to my previous experiments (<a href="https://msadowski.github.io/Realsense-T265-First-Impressions/">T265 test</a> and <a href="https://msadowski.github.io/hands-on-with-slam_toolbox/">slam_toolbox evaluation</a>) I’ve run the test described in this post on a <a href="https://www.robosynthesis.com/">Robosynthesis</a> development platform.</p>
<p>The only difference from the previous posts is that I added an IMU, therefore the fused robot odometry should be a tad bit better. Highlights of the setup are as follows:</p>
<ul>
<li>Robosynthesis differential-drive mobile robot development platform</li>
<li>RPLidar A1 (hobby grade, scanning at ~7Hz)</li>
<li>Onboard computer running ROS Melodic and <a href="https://github.com/iris-ua/iris_lama">iris_lama</a> (commit: 07808d87) and <a href="https://github.com/iris-ua/iris_lama_ros">iris_lama_ros</a> (commit: 54df5359)</li>
<li>Pixhawk autopilot used as an IMU</li>
</ul>
<h2 id="first-lama-impressions">First LaMa impressions</h2>
<p>LaMa is a library developed by the Intelligent Robotics and Systems (IRIS) Laboratory at the University of Aveiro. The package is maintained by <a href="https://github.com/eupedrosa">Eurico F. Pedrosa</a> and is open sourced under the BSD 3-Clause licence.</p>
<p>Getting started with LaMa was very simple - pull the two repositories, build them, change a couple of parameters (scan_topic, global_frame_id, odom_frame_id, base_frame_id) to match my platform, and run it.</p>
<p>There are 3 nodes of interest in the ROS package:</p>
<ul>
<li>slam2d_ros - online SLAM</li>
<li>loc2d_ros - localization in the known map</li>
<li>pf_slam2d_ros - particle filter slam</li>
</ul>
<p>In this blog post I’ll focus on the first two nodes.</p>
<h3 id="online-slam">Online SLAM</h3>
<figure class="center">
<img src="/images/ros_slam/lama_map.png" alt="Map created with LaMa" />
<figcaption>SLAM with LaMa</figcaption>
</figure>
<p>The first impression I got using this SLAM package was that it “just works”. Out of the box, without tuning any parameter values except for the frame name changes, the SLAM seemed to perform very well. For the full parameter list see the <a href="https://github.com/iris-ua/iris_lama_ros/blob/master/README.md">project readme</a>. I found that the number of parameters you can tune is smaller than in other packages I have used so far (looking at you, Cartographer). The good thing is that you don’t need to spend much time tuning parameters to get good results; on the other hand, you don’t have as much flexibility.</p>
<h3 id="localization">Localization</h3>
<p>With the localization node you can start a map_server and the localization node will work with the map coming from the server. Currently the node doesn’t support global localization (as per Readme.md), but instead you can send the robot pose on the <code class="language-plaintext highlighter-rouge">/initialpose</code> topic (this means that setting the pose estimate from RViz works too). I found that even if you set the pose ~1-2 m from the real robot pose it converges to the true pose as the robot moves around.</p>
<h2 id="apples-to-oranges">Apples to oranges</h2>
<p>Since I already covered slam_toolbox in one of my <a href="https://msadowski.github.io/hands-on-with-slam_toolbox/">previous posts</a> I’ve decided I would try to see how they compare. Do you think an arbitrary comparison of two SLAM results is meaningless? Me too! But I’m going to do it anyway.</p>
<iframe width="560" height="315" src="https://www.youtube.com/embed/Cgcl3LcFnEs" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen=""></iframe>
<p>Above you will find a video comparison of online SLAM with iris_lama compared to slam_toolbox. I ran the two solutions with default configs and only changed the platform-related settings (mostly the names of frames and LiDAR range). Here are some of my observations on LaMa:</p>
<ul>
<li>Visually the position estimate of LaMa looks a bit smoother (less discrete jumps)</li>
<li>Erroneous readings from my LiDAR are added to the map in LaMa (you can see points appearing way outside of the map)</li>
<li>The very small features seem to be missing from the map (you can see that well in the upper right corner of the map)</li>
</ul>
<p>It’s quite possible that some of the things above can be tuned through parameters, however I didn’t find any obvious parameters described in the readme file.</p>
<p>Another thing I wanted to know is how the position estimates compare. For this I used the <a href="https://michaelgrupp.github.io/evo/">evo Python Package</a> and described my workflow in full in my <a href="https://msadowski.github.io/Comparing-SLAM-with-ROS-evo/">previous post</a>. In the screenshots below I have used the output from iris_lama as the ground truth:</p>
<figure class="half">
<img src="/images/ros_slam/rpe_3.png" />
<img src="/images/ros_slam/rpe_4.png" />
<figcaption>Relative Pose Error results on my data</figcaption>
</figure>
<p>What these images tell us is that most of the time the output of both packages is quite close to one another (you can right click on the image to see it in full size). The issue with this comparison is that without ground truth we can’t say anything more about it, which brings me to the next section.</p>
<h3 id="cpu-usage">CPU usage</h3>
<p>I think the biggest advantage of iris_lama is its low CPU usage (in the project readme the author says that it runs great on a Raspberry Pi 3B+). Here is the comparison of CPU usage of iris_lama (left) and slam_toolbox (right).</p>
<figure class="half">
<img src="/images/ros_slam/lama_cpu.png" />
<img src="/images/ros_slam/slam_toolbox_cpu.png" />
<figcaption>CPU usage comparison between the two packages</figcaption>
</figure>
<p>You will see that the LaMa CPU usage peaks at around 15% and slam_toolbox at 450%. I hope that one day I will get to repeat this experiment in a larger environment and see how both packages manage it. I grabbed the CPU usage of both packages using the <a href="https://github.com/pumaking/cpu_monitor">cpu_monitor</a> package, running it alongside the bag file with raw data and each SLAM package. Then I plotted the output using <a href="https://github.com/facontidavide/PlotJuggler">PlotJuggler</a>.</p>
<h2 id="links">Links</h2>
<p>Here are some links that can provide you with further reading or allow you to replicate my results.</p>
<ul>
<li><a href="https://discourse.ros.org/t/announcing-lama-an-alternative-localization-and-mapping-package/10916">ROS discourse discussion</a> on LaMa</li>
<li><a href="https://drive.google.com/file/d/1GLs5PdKEzpkgN3aeGtPtEOW46JrIaXBr/view?usp=sharing">A bagfile with raw data I used for this blog post</a></li>
</ul>
<h2 id="next-steps">Next steps</h2>
<p>I’ve recently received a <a href="https://www.ardusimple.com/product/simplertk2b-heading-basic-starter-kit-ip67/">simpleRTK2B+heading</a> module from <a href="https://www.ardusimple.com/">ArduSimple</a> which should get me sub centimeter level position accuracy. This should be a decent source of ground truth for my follow up experiments. Stay tuned for more information in the next post!</p>
<p><a href="https://msadowski.github.io/iris-lama-slam-with-ros/">Giving LaMa a shot</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on November 29, 2019.</p>
ROS Robotics https://msadowski.github.io/Comparing-SLAM-with-ROS-evo2019-11-21T00:00:00-00:002019-11-21T00:00:00+01:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>While working on another blog post I fell into a rabbit hole of comparing SLAM packages using ROS. This blog post briefly describes how I managed to compare the results of two SLAM packages by using <a href="https://michaelgrupp.github.io/evo/">evo Python package</a>.</p>
<!-- more -->
<h2 id="background">Background</h2>
<p>Recently I’ve started working a lot with various SLAM packages in ROS using the <a href="https://www.robosynthesis.com/">Robosynthesis</a> dev platform. I figured that opening RViz, displaying the robot model in the map frame and saying “yeaah, that looks about right” might not necessarily be the best way to evaluate the quality of SLAM.</p>
<p>What my data lacks is ground truth, as at the moment I don’t have a way to capture it. This post describes a way to evaluate the output of two SLAM methods against one another.</p>
<h2 id="evo">evo</h2>
<p>I found evo while doing research for my newsletter Weekly Robotics and featured it in <a href="https://weeklyrobotics.com/weekly-robotics-58">issue #58</a>. As per the description on the <a href="https://michaelgrupp.github.io/evo/">project website</a>:</p>
<blockquote>
<p>This package provides executables and a small library for handling, evaluating and comparing the trajectory output of odometry and SLAM algorithms.</p>
</blockquote>
<p>The thing most interesting for us in the scope of this blog post is that it supports bag files!</p>
<h2 id="gathering-the-data">Gathering the data</h2>
<p>To gather the data I drove the robot around my office with my whole mapping stack running and made sure that the robot surveyed enough area to build a good map. The next step was to run 2 SLAM packages using the data from my .bag file. To do this correctly you need to ensure your bag contains only the data of interest, hence rosbag filter comes in handy:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>rosbag filter slam.bag bag_filtered.bag "topic == '/clock' or topic == '/rr_robot/joint_states' or topic == '/odometry/filtered' or topic == '/scan' or (topic=='/tf' and m.transforms[0].header.frame_id != 'map' or topic=='/tf_static')"
</code></pre></div></div>
<p>The most important things we need from the bag are the clock, the laser scan topic and the whole tf structure except the map->odom transform (assuming your setup follows REP-105).</p>
<p>After these steps we should have a set of raw data that we can use to run some SLAM packages from the recorded input. The main reason I find this useful is that it allows tuning the parameters to some extent and rapidly visualising the change in quality.</p>
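<p>The filter expression passed to rosbag filter is plain Python evaluated per message, and its key part is the tf clause: /tf messages survive only when the transform’s parent frame is not ‘map’ (dropping the SLAM-published map->odom transform), while /tf_static is kept unconditionally. The same predicate written out as a standalone function (the topic names are the ones from my command - adjust them to your robot):</p>

```python
# Topics kept unconditionally by the rosbag filter expression.
KEEP_TOPICS = {'/clock', '/rr_robot/joint_states',
               '/odometry/filtered', '/scan', '/tf_static'}

def keep_message(topic, tf_parent_frame=None):
    """Return True if a bag message should survive the filter.

    Mirrors the rosbag filter expression: keep the raw-data topics,
    keep /tf_static, and keep /tf only when the transform's parent
    frame is not 'map' (i.e. drop the SLAM-made map->odom transform).
    """
    if topic in KEEP_TOPICS:
        return True
    if topic == '/tf':
        return tf_parent_frame != 'map'
    return False
```

<p>Note that the filter only checks the first transform in each /tf message; if a publisher batches the map->odom transform together with others in one message, that whole message would slip through or be dropped together.</p>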
<h2 id="rebagging-the-data">Rebagging the data</h2>
<p>The next step I performed was to run the SLAM package of choice with the recorded data. A couple of pointers:</p>
<ul>
<li>Make sure to set the use_sim_time parameter to true</li>
<li>Run your bagfile with the --clock argument</li>
<li>Visualise everything in rviz to make sure your data is working correctly. I advise displaying the map and pose of the robot</li>
</ul>
<p>When running the SLAM node make sure that you record a bag file of your run (let’s call it run_1.bag); this time record all the data, as we won’t be filtering it again.</p>
<p>After you have grabbed run_1.bag you can redo the experiment with another package and create the run_2.bag file.</p>
<h2 id="extracting-the-data">Extracting the data</h2>
<p>Evo currently works only with geometry_msgs/PoseStamped, geometry_msgs/TransformStamped, geometry_msgs/PoseWithCovarianceStamped and nav_msgs/Odometry topics. That means that in the case of most SLAM packages I have used to date you can’t use it directly, as most of them provide the map->odom tf transform instead.</p>
<p>Luckily the evo repository contains <a href="https://github.com/MichaelGrupp/evo/blob/master/contrib/record_tf_as_posestamped_bag.py">a file</a> we can use for our needs. In short, this script looks at the bag file and creates a pose topic between two frames of interest. This is exactly what we need, and we will run it twice, once on each of our run_*.bag files:</p>
<p>WARNING! The commands below are destructive; make sure you keep copies of your original bag files.</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>./record_tf_as_posestamped_bag.py --lookup_frequency 15 --output_topic run_1_pose --bagfile run_1.bag map base_link
</code></pre></div></div>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>./record_tf_as_posestamped_bag.py --lookup_frequency 15 --output_topic run_2_pose --bagfile run_2.bag map base_link
</code></pre></div></div>
<p>The above commands assume that your map frame is called ‘map’ and that the reference point on your robot is called ‘base_link’. After running the script each bag file will contain only a single topic named run_*_pose.</p>
<h2 id="merging-bags">Merging bags</h2>
<p>At the moment it is not possible to run evo with two bag files as input, therefore we need to merge them. The way I did it was by running <a href="https://gist.github.com/troygibb/21fec0c748227eec89338054e6dd1833">this script</a>:</p>
<div class="language-plaintext highlighter-rouge"><div class="highlight"><pre class="highlight"><code>./bagmerge.py run_1.bag run_2.bag -o merged.bag
</code></pre></div></div>
<h2 id="working-with-evo">Working with evo</h2>
<p>Having the data with two topics (run_1_pose and run_2_pose) we can now get some metrics on our solutions using commands like:</p>
<ul>
<li>‘evo_ape’ - absolute pose error</li>
<li>‘evo_rpe’ - relative pose error</li>
</ul>
<p>You would run them as follows: <code class="language-plaintext highlighter-rouge">evo_rpe bag merged.bag run_1_pose run_2_pose --plot</code> and you would see the following output:</p>
<figure class="half">
<img src="/images/ros_slam/rpe_1.png" />
<img src="/images/ros_slam/rpe_2.png" />
<figcaption>Relative Pose Error results on my data</figcaption>
</figure>
<p>What I’m showing here is probably just the tip of the iceberg; for more information about evo please see <a href="https://github.com/MichaelGrupp/evo/wiki">the wiki</a>.</p>
<h2 id="results-discussion">Results discussion</h2>
<p>The most important thing lacking in my comparison is ground truth. Without it we can’t be certain which of the results is more correct, but if you have access to a system that can provide ground truth this approach should work very well, and you should be able to use the ground truth straight out of the box.</p>
<p>I think that the process I described in this blog post is a bit complex at the moment, but it should be quite easy to streamline. If you have ideas on how to improve it, let me know!</p>
<p><a href="https://msadowski.github.io/Comparing-SLAM-with-ROS-evo/">Comparing SLAM results using ROS and evo</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on November 21, 2019.</p>
ROS Robotics https://msadowski.github.io/Realsense-T265-First-Impressions2019-10-15T00:00:00-00:002019-10-15T00:00:00+02:00Mateusz Sadowskihttps://msadowski.github.io[email protected]
<p>In this post I will write about my first impressions after working with Realsense T265 on a wheeled mobile robot and give some tips on the configuration I think is most correct with respect to ROS standards.</p>
<!-- more -->
<h2 id="setting-up">Setting up</h2>
<figure class="center">
<img src="https://msadowski.github.io/images/dev_platform.jpg" alt="Mobile robot" />
<figcaption>Robosynthesis development platform</figcaption>
</figure>
<p>I’ve tested the T265 on a Robosynthesis Dev Platform that you might have seen in my <a href="https://msadowski.github.io/hands-on-with-slam_toolbox/">previous post</a>. While working with the T265 tracking camera I spent a fair bit of time going through the documentation and the ROS package source code; I hope the insights I describe in this post will help you get started.</p>
<p>There are two sets of software we will be interested in in this blog post:</p>
<ul>
<li>realsense-ros package (<a href="https://github.com/IntelRealSense/realsense-ros">GitHub</a>) (while working on this post I was on da4bb5d commit hash)</li>
<li>librealsense (<a href="https://github.com/IntelRealSense/librealsense">GitHub</a>)</li>
</ul>
<p>There are some inconsistencies between the above packages and I’ll do my best to clear them up in this post. Please note that the T265 seems to be in an active state of development therefore some of the information contained in this post might change over time (I’ll do my best to keep it up to date though).</p>
<h2 id="coordinate-frames">Coordinate frames</h2>
<p>Coordinate frame setup is something I had the most issues with when I first started with T265 with realsense-ros package. First let’s look at the T265 frame (as seen in <a href="https://github.com/IntelRealSense/librealsense/blob/master/doc/t265.md">librealsense</a>).</p>
<figure class="center">
<img src="https://msadowski.github.io/images/T265_sensor_extrinsics.png" alt="T265 VR coordinate frames" />
<figcaption>T265 coordinate frames</figcaption>
</figure>
<p>T265 uses a VR coordinate system which differs from the one you would expect in ROS. Fortunately the realsense-ros handles the tf transforms for us, making our lives a bit easier. Here is what we need to keep in mind:</p>
<ul>
<li>Any frames with the word <strong>optical</strong> in them (e.g. camera_fisheye_optical_frame) are the same as the T265 frames in VR coordinates</li>
<li>The camera and IMU messages are delivered in the optical frames</li>
<li>The parameter pose_frame_id in the realsense-ros <strong>differs</strong> from the librealsense pose frame. Its orientation follows ROS convention (x-front, y-left, z-up in the global frame)</li>
<li>pose_frame_id in realsense-ros is the location of the camera in the odometry frame. There is a static transform between the pose_frame_id and the base_frame_id of the camera (not to be mistaken with the base_frame in the ROS traditional sense)</li>
</ul>
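<p>To make the difference between the two conventions concrete, here is the mapping for a single point, assuming the usual VR/optical convention (x-right, y-up, z-back) on one side and the ROS body convention described above (x-forward, y-left, z-up) on the other. This is purely illustrative - realsense-ros performs this conversion for you via tf:</p>

```python
def vr_to_ros(x_vr, y_vr, z_vr):
    """Map a point from the T265 VR convention (x-right, y-up, z-back)
    to the ROS body convention (x-forward, y-left, z-up).

    forward = -z_vr, left = -x_vr, up = y_vr.
    Illustrative only - realsense-ros handles this through tf.
    """
    return (-z_vr, -x_vr, y_vr)

# A point 2 m ahead and 0.5 m to the left of the camera -> (2.0, 0.5, 0.0)
print(vr_to_ros(-0.5, 0.0, -2.0))
```

<p>Seeing the axis swap written out makes it easier to sanity-check tf trees when something looks rotated by 90 degrees in RViz.</p>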
<p>Below you can see an example ROS tf tree that the realsense-ros can provide for us:</p>
<figure class="center">
<img src="https://msadowski.github.io/images/t265_tf_tree.png" alt="Realsense T265 tf tree" />
<figcaption>Realsense T265 tf tree</figcaption>
</figure>
<p>You should be able to get the same output by setting the following parameters in the launch file (pseudocode):</p>
<ul>
<li>camera = “rs_t265”</li>
<li>tf_prefix = “$(arg camera)”</li>
<li>publish_odom_tf = “true”</li>
<li>odom_frame_id = “odom”</li>
<li>base_frame_id = “$(arg tf_prefix)_link”</li>
<li>pose_frame_id = “$(arg tf_prefix)_pose_frame”</li>
</ul>
<h3 id="shortcomings">Shortcomings</h3>
<p>There are two things to note about the above tf structure (most of them coming from <a href="https://www.ros.org/reps/rep-0105.html">REP-105</a>)</p>
<ul>
<li>A tf frame can have only one parent. This means that if you have a base_link frame specified in your robot description you won’t be able to directly specify a transform for base_link->rs_t265_link or base_link->rs_t265_pose_frame as this would break your tree</li>
<li>By convention the measurements in the odometry frame have to be continuous (without discrete jumps). This means that if you were to use the setup described above you would need to set the “enable_pose_jumping” parameter to false (<a href="https://github.com/IntelRealSense/realsense-ros/issues/923#issuecomment-531547065">GitHub issue</a>). More on this later</li>
</ul>
<h2 id="the-most-ros-proper-setup-i-can-think-of">The most ROS-proper setup I can think of</h2>
<p>Here are some of the considerations for creating a most proper setup with ROS package for Intel Realsense T265 that I can think of.</p>
<p>First of all, I like my sensor frames being relative to the base_link frame of my robot platform. Therefore in my URDF description I would define a static transform from base_link to the camera pose_frame (the ROS one). To have this working we need to set the “publish_odom_tf” parameter to false (this way we ensure that the camera pose_frame has a single parent, base_link).</p>
<p>Say we would like to use the camera as a source of odometry. My suggestion for getting there is to:</p>
<ul>
<li>Run a <a href="http://docs.ros.org/melodic/api/robot_localization/html/index.html">robot_localization node</a> that will listen to the odometry message from T265 and publish an odom->base_link transform. That way we can easily ensure that the tree is continuous</li>
<li>Set the enable_pose_jumping parameter to false so that the pose of the robot in the odometry frame is continuous (<strong>WARNING</strong>: read this section to the end before implementing this, since it might cause significant errors). Some of the packages you are using might make assumptions following REP-105, so better safe than sorry</li>
<li>Optional: create a t265 filter node that will change the odometry frame_id from the pose_frame to base_link. That way you should be able to directly compare the various odometry sources on your platform (at least comparing the wheel odometry against the t265 odometry sounds like an interesting experiment)</li>
</ul>
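<p>The optional filter node from the last bullet is mostly bookkeeping: subscribe to the camera odometry, rewrite the frame IDs, republish. The interesting part is just the retagging, sketched here as a plain function so it stays independent of the ROS message classes (the frame names are examples, not fixed by realsense-ros):</p>

```python
def retag_odometry(odom_msg, frame_id="odom", child_frame_id="base_link"):
    """Rewrite the frames of a nav_msgs/Odometry-like message so that
    T265 odometry can be compared directly with wheel odometry
    expressed at base_link.

    A real node would subscribe to the camera's odometry topic, call
    this function on each message, and republish the result.  Note this
    only relabels the message - if the camera is mounted with an offset
    from base_link, the pose itself also needs to be transformed
    (e.g. with tf2) before the comparison is meaningful.
    """
    odom_msg.header.frame_id = frame_id
    odom_msg.child_frame_id = child_frame_id
    return odom_msg
```

<p>With both sources reporting in the same frames, plotting them side by side in PlotJuggler becomes a one-step job.</p>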
<p>I successfully ran some tests with the first two points from the above list and I was quite satisfied with how ‘clean’ the setup was w.r.t. ROS good practices. Because of the <a href="https://github.com/IntelRealSense/librealsense/issues/4876">velocity drift</a> that I observed on multiple occasions in my setup, I stopped looking into it.</p>
<h2 id="wheel-odometry">Wheel odometry</h2>
<p>According to <a href="https://github.com/IntelRealSense/realsense-ros#using-t265">the docs</a> T265 absolutely requires wheel odometry for robust and accurate tracking. To provide the odometry information you will need to:</p>
<ol>
<li>Specify the <em>topic_odom_in</em> parameter</li>
<li>Create a file with odometry calibration</li>
</ol>
<p>The first requirement is trivial; you just need to make sure that you correctly specify the topic name.</p>
<p>The fields you need to fill in in the calibration file are T and W vectors in extrinsics field (you will find an example in <a href="https://github.com/IntelRealSense/librealsense/pull/3462#issuecomment-472491730">this comment</a>). What you need to keep in mind is that T is the translation from camera pose frame (in the sensor/VR frame, <strong>not the realsense-ros pose frame</strong>) to your base_link and W is the rotation between these two frames in <strong>axis-angle representation</strong>. You will find some useful information about this in <a href="https://github.com/IntelRealSense/librealsense/pull/3462">this pull request</a>.</p>
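<p>Since the W field has to be given in axis-angle form, which most ROS tooling doesn’t hand you directly, here is one way to get there from a quaternion (the usual ROS orientation representation). This is a generic conversion sketch, not code from librealsense - double-check the result against the example calibration file linked above:</p>

```python
import math

def quaternion_to_axis_angle(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) to the axis-angle vector
    (axis * angle, in radians) expected for the W field of the
    wheel-odometry calibration file."""
    norm = math.sqrt(x * x + y * y + z * z)
    if norm < 1e-12:
        return (0.0, 0.0, 0.0)  # identity rotation
    angle = 2.0 * math.atan2(norm, w)
    return (x / norm * angle, y / norm * angle, z / norm * angle)

# A 90-degree rotation about Z -> roughly (0, 0, pi/2)
q = (0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(quaternion_to_axis_angle(*q))
```

<p>Remember that the rotation has to be expressed between the camera’s VR pose frame and base_link, not the realsense-ros pose frame, so build the quaternion accordingly before converting.</p>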
<h2 id="closing-thoughts">Closing thoughts</h2>
<p>I really like the idea behind the Realsense T265. Having an affordable sensor that can be easily integrated onto any robot would be a great thing. I think the T265 is going in the right direction; however, I would let it mature before using it on a commercial system. I believe it will get there and will provide a true ‘plug&amp;play’ ROS experience, adhering to good ROS practices.</p>
<p>Is there anything that I missed? Your feedback is highly valued so feel free to leave a comment! I’ll be following the T265 development and try to update this post as needed.</p>
<h2 id="update">Update</h2>
<p>It has been almost a year since I wrote this blog post. I was recently looking to integrate the T265 on a robot. Unfortunately, there are two problems that prevented me from using it:</p>
<ul>
<li><a href="https://github.com/IntelRealSense/librealsense/issues/4518">Pose data is NaN</a></li>
<li><a href="https://github.com/IntelRealSense/librealsense/issues/5850#issuecomment-688791413">Odometry drift when PoseJumping is disabled</a></li>
</ul>
<p>As you will see in the comments, Intel is not planning to address any existing or new issues on the T265 as they have shifted focus to new products. If you are making robots that have to run over long periods of time with high accuracy, I would advise waiting for the new products to come out before investing in the T265.</p>
<p><a href="https://msadowski.github.io/Realsense-T265-First-Impressions/">Intel Realsense T265 tracking camera for mobile robotics - first impressions</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on October 15, 2019.</p>
<p>Tags: ROS, Robotics</p>
<hr />
<p>Mateusz Sadowski, <a href="https://msadowski.github.io">msadowski blog</a>, September 18, 2019</p>
<p>In the past couple of weeks, as part of a project with <a href="https://www.robosynthesis.com/">Robosynthesis</a>, I’ve been exploring <a href="https://github.com/SteveMacenski/slam_toolbox">slam_toolbox</a> by <a href="https://github.com/SteveMacenski">Steven Macenski</a>. This post summarizes my experience so far.</p>
<!-- more -->
<p>Here is the description of the package taken from the project repository:</p>
<blockquote>
<p>Slam Toolbox is a set of tools and capabilities for 2D SLAM built by Steve Macenski while at Simbe Robotics and in his free time.
This project contains the ability to do most everything any other available SLAM library, both free and paid, and more. This includes:</p>
<ul>
<li>Ordinary point-and-shoot 2D SLAM mobile robotics folks expect (start, map, save pgm file) with some nice built in utilities like saving maps</li>
<li>Continuing to refine, remap, or continue mapping a saved (serialized) pose-graph at any time</li>
<li>life-long mapping: load a saved pose-graph continue mapping in a space while also removing extraneous information from newly added scans</li>
<li>an optimization-based localization mode built on the pose-graph. Optionally run localization mode without a prior map for “lidar odometry” mode with local loop closures</li>
<li>synchronous and asynchronous modes of mapping</li>
<li>kinematic map merging (with an elastic graph manipulation merging technique in the works)</li>
<li>plugin-based optimization solvers with a new optimized Google Ceres based plugin</li>
<li>RVIZ plugin for interacting with the tools</li>
<li>graph manipulation tools in RVIZ to manipulate nodes and connections during mapping</li>
<li>Map serialization and lossless data storage</li>
<li>… more but those are the highlights</li>
</ul>
</blockquote>
<h2 id="our-test-setup">Our test setup</h2>
<p>Here are the highlights of our setup:</p>
<ul>
<li>Robosynthesis differential-drive mobile robot development platform</li>
<li>RPLidar A1 (hobby grade, scanning at ~7Hz)</li>
<li>Onboard computer running ROS Melodic and slam_toolbox (commit: 9b4fa1cc83c2f)</li>
</ul>
<figure class="center">
<img src="https://msadowski.github.io/images/dev_platform.jpg" alt="Mobile robot" />
<figcaption>Robosynthesis development platform used for slam_toolbox evaluation</figcaption>
</figure>
<h2 id="first-impression">First impression</h2>
<p>The first test we ran used the default parameters that come with slam_toolbox, with minor changes (frame names and max laser range).</p>
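<p>For reference, these are the kinds of overrides I mean. The parameter names below come from slam_toolbox’s mapper_params_online_async.yaml; the values are illustrative, not our exact configuration:</p>

```yaml
# Frames -- make these match your robot's TF tree
odom_frame: odom
map_frame: map
base_frame: base_link

# The RPLidar A1 is rated for roughly 12 m indoors,
# so cap the usable laser range accordingly.
max_laser_range: 12.0
```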
<p>In ROS, as a good practice, we usually have a TF tree setup in the following way (at least as a minimum when doing SLAM):</p>
<p><strong>map -> odom -> base_link</strong></p>
<p>If you would like to know more about the transforms, then <a href="https://www.ros.org/reps/rep-0105.html">REP-105</a> is your friend. Sometimes we run a competition in the office on who can recite it faster!</p>
<p>We expect slam_toolbox to provide us with the <strong>map -> odom</strong> transform. In our tests we’ll take the <strong>odom -> base_link</strong> transform from wheel odometry. Is it going to drift? Yes, it will! But slam_toolbox will have our back!</p>
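<p>To make that concrete: the SLAM node estimates the robot’s pose in the map frame, and publishing <strong>map -> odom</strong> as the difference between that estimate and the wheel odometry is exactly how the drift gets absorbed. Here is a toy 2D illustration in plain Python (frames simplified to (x, y, yaw); the numbers are made up):</p>

```python
import math

def compose(a, b):
    """Compose two 2D poses a∘b, each given as (x, y, theta)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + math.cos(at) * bx - math.sin(at) * by,
            ay + math.sin(at) * bx + math.cos(at) * by,
            at + bt)

def invert(p):
    """Invert a 2D pose, so that compose(p, invert(p)) is the identity."""
    x, y, t = p
    return (-(math.cos(t) * x + math.sin(t) * y),
            math.sin(t) * x - math.cos(t) * y,
            -t)

# SLAM's estimate of the robot in the map frame:
map_T_base = (2.0, 1.0, math.pi / 2)
# Drifting wheel odometry estimate of the same pose:
odom_T_base = (1.9, 1.2, math.pi / 2 + 0.05)
# The correction the SLAM node publishes:
map_T_odom = compose(map_T_base, invert(odom_T_base))
# Chaining map -> odom -> base_link recovers the map-frame pose
# (up to floating-point error):
check = compose(map_T_odom, odom_T_base)
```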
<p>Here is a short gif showing our first test, driving the robot at a reasonable speed (at least for an indoor robot) around an office:</p>
<figure class="center">
<img src="https://msadowski.github.io/images/slam_toolbox_odom.gif" alt="slam_toolbox mapping" />
<figcaption>Performing SLAM with slam_toolbox (replay with 3x rate)</figcaption>
</figure>
<p>And if you’d like to see some of the raw data used during the above session then you can download the bag file <a href="https://drive.google.com/file/d/1S6ceDqPf1Z_5Pq49td9y3ZL9Jpdyc1Ok/view?usp=sharing">here</a>.</p>
<h2 id="taking-it-further">Taking it further</h2>
<p>What you’ve seen in this blog post are only the first trials with this SLAM package. If you take a look at the <a href="https://github.com/SteveMacenski/slam_toolbox/blob/melodic-devel/config/mapper_params_online_async.yaml">configuration files</a> you will see that there are lots of parameters that can be tuned.</p>
<p>A very helpful tool that comes with slam_toolbox is the RViz plugin (Panels->Add New Panel->slam_toolbox->SlamToolboxPlugin).</p>
<figure class="center">
<img src="https://msadowski.github.io/images/slam_toolbox_rviz.png" alt="Rviz plugin" />
<figcaption>slam_toolbox RViz plugin window</figcaption>
</figure>
<p>This blog post is only the beginning of our adventure with slam_toolbox, but we liked it so much that we decided to share the results with you. Stay tuned for more information about the hardware and open source software that we use!</p>
<p><a href="https://msadowski.github.io/hands-on-with-slam_toolbox/">Hands on with slam_toolbox</a> was originally published by Mateusz Sadowski at <a href="https://msadowski.github.io">msadowski blog</a> on September 18, 2019.</p>
<p>Tags: ROS, Robotics</p>