
Gypsy, The Software, T-24 hours…

While Lil’ Joe sports a fairly simple program, Gypsy ended up with one of the more complex programs I’ve written in NXT-G. The program (still named “Nadar”) really needed to handle several different tasks at once. Since Gypsy is primarily a photographic platform, the program had to operate the camera in both still & video modes and take interesting pictures while controlling the platform. As a secondary goal, it needed to datalog the numerous environmental sensors hooked up, as well as “engineering data” (like battery voltage, platform angle, motor noise, etc.). On top of this, it needed to monitor and control the payload temperature, which is critical to keeping the payload alive. And to complicate all of this, the program needed to be as small as possible, leaving as much memory as possible for datalogging.

In order to handle the photographic challenge, the program uses the same approach as the original Nadar: following a script. It reads a series of timed commands from a text file (think of it as a “photographic program” that Gypsy follows), executes each command (“take picture”, “turn on video”, “tilt platform”, etc.), and then waits a specified time before it reads & executes the next command in the file. This way the programming code for, say, “turn on the camera, take a picture, and turn off the camera” only needs to appear once in the program – it is just invoked every time it’s needed, saving huge amounts of memory over a simple “string of Motor blocks” style program (for comparison, several hundred still photos and five videos, with numerous repositionings of the pitch of the platform, take only 0.4 k of the NXT memory when stored as such a “script file”).
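Since NXT-G is a graphical language, I can’t paste the actual blocks here, but the idea is easy to sketch in Python. The command names and the “delay command” file format below are illustrative assumptions, not the actual format Gypsy uses – the point is just that one small interpreter loop replaces hundreds of individually configured blocks.

```python
# Minimal sketch of a script-following interpreter (Python stand-in for NXT-G).
import time

def take_still():      print("camera: take still")
def start_video():     print("camera: start video")
def stop_video():      print("camera: stop video")
def tilt_platform():   print("platform: tilt")

# Each routine appears once; the script file decides when it runs.
COMMANDS = {
    "STILL": take_still,
    "VIDEO_ON": start_video,
    "VIDEO_OFF": stop_video,
    "TILT": tilt_platform,
}

def run_script(path):
    """Read 'delay_seconds command' lines and execute them in sequence."""
    with open(path) as script:
        for line in script:
            fields = line.split()
            if len(fields) != 2:
                continue                  # skip blank or malformed lines
            delay, command = fields
            time.sleep(float(delay))      # wait until this command is due
            COMMANDS[command]()           # invoke the shared routine once

# run_script("photoscript.txt")
```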

To handle datalogging at the same time, the program uses a similar system of “timed events”. The same loop that handles waiting for the next timed camera command also checks whether, for instance, two seconds have elapsed since the last pressure & temperature logging – if they have, it logs the current pressure/temperature and resets that timer as well. The same process occurs (with different intervals) for engineering data, heater checks, etc. There’s no reason to do these in multiple parallel sequences… after all, most of the time all of these would just be sitting around “waiting” for the right amount of time to elapse. So instead, they all sit in the same loop, checked in series, which also lets the program execute faster. The internal temperatures and the state of the internal heaters are handled by this same “timed event” strategy, so the program is fairly easy to troubleshoot – most of the sections work the same way.
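Here is a rough sketch of that single-loop scheduler, again in Python for readability. The task names and intervals are made-up placeholders; what matters is that every task records when it last ran, and one loop checks them all in series.

```python
# Sketch of a "timed event" scheduler: one loop, many tasks, no parallelism needed.
import time

def log_pressure_temperature(): pass   # placeholder: log pressure & temperature
def log_engineering_data():     pass   # placeholder: battery voltage, platform angle, ...
def check_heater():             pass   # placeholder: payload temperature control

tasks = [
    {"interval": 2.0,  "last": 0.0, "run": log_pressure_temperature},
    {"interval": 10.0, "last": 0.0, "run": log_engineering_data},
    {"interval": 5.0,  "last": 0.0, "run": check_heater},
]

while True:                              # main mission loop
    now = time.time()
    for task in tasks:
        if now - task["last"] >= task["interval"]:
            task["run"]()                # do the work for this task
            task["last"] = now           # reset its timer
    time.sleep(0.05)                     # everything shares this one loop
```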

There’s a lot of data to record as well: ambient pressure, three different temperature sensors, external light level, the angle (or pitch) of the platform, NXT battery voltage, when each picture was taken (& whether it was a still or a video), when the platform angle is changed (& how loud the motors sound during this event, the background noise level, and how loud the NXT “beep” sounds at that point), the two-second average acceleration and sound level, etc. To fit as much as possible into the datalog, all this information is “compressed” using a very simple data compression scheme coded up in NXT-G, which packs information about twice as densely as a text file “normally” would. The datalogging and data compression are handled by a single self-contained My Block to make things easier – you wire in the information, and it handles writing it to the file, opening a new file if the old one is full or flawed, timestamping it, and compressing everything.
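To give a feel for why packing roughly doubles the density, here is one simple illustrative approach (my own guess, not the actual NXT-G scheme): store each reading as a fixed-size binary field instead of the 4-6 ASCII characters plus separators it would take as text.

```python
# Illustrative packing of one timestamped record (hypothetical field layout).
import struct

def pack_record(timestamp, pressure, temperature, light):
    """Pack one record into 8 bytes instead of a ~16-character text line."""
    # '<HHhH': little-endian; timestamp, pressure, and light as unsigned
    # 16-bit values, temperature as a signed 16-bit value (it can go negative).
    return struct.pack("<HHhH", timestamp, pressure, temperature, light)

record = pack_record(1234, 870, -57, 213)
print(len(record), "bytes")   # 8 bytes, vs. roughly 16+ as comma-separated text
```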

On top of all this, the program can switch to a different photoscript just by changing a single variable. Taking advantage of this, when the program detects a free-fall event (like when the balloon bursts), it switches photographic scripts and changes how frequently certain variables are datalogged, so it can adjust its own behavior during the mission (hey, otherwise it’s not a robot, right?).
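A sketch of that idea, with the threshold, sample window, and file names as purely illustrative assumptions: if the measured acceleration stays near zero for a few consecutive readings, assume the balloon has burst, swap in a different photoscript, and log faster.

```python
# Sketch of free-fall detection driving a script switch (values are assumptions).
FREE_FALL_THRESHOLD = 0.3   # g; acceleration magnitude is near zero in free fall
FREE_FALL_SAMPLES = 5       # consecutive low readings required before switching

def detect_free_fall(readings):
    """Return True if the last few acceleration readings look like free fall."""
    recent = readings[-FREE_FALL_SAMPLES:]
    return len(recent) == FREE_FALL_SAMPLES and all(
        abs(g) < FREE_FALL_THRESHOLD for g in recent
    )

script_name = "ascent.txt"
log_interval = 2.0
accel_history = [1.0, 1.0, 0.1, 0.05, 0.02, 0.01, 0.02]   # example data

if detect_free_fall(accel_history):
    script_name = "descent.txt"   # switch to the post-burst photoscript
    log_interval = 0.5            # log certain variables more often
```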

How big is this complex, multi-tasked program? Remember, I need to save as much space as possible for datalogging all the information from this multi-hour mission, so size is critical. Well, it turns out that on the NXT it occupies only about 20k – under 20% of the available memory. The “secret” (if that’s the right word) is that I simply reuse code – I don’t follow the more common “string of Motor blocks” style that shows up in many NXT-G programs I’ve seen. The result is a program with 25 different My Blocks, interconnected and repeatedly calling each other on multiple levels, instead of 250+ individual Motor (and other) blocks, each of which I would have to set up, and each of which would take up memory. The program may look complex when diagrammed out, but it really isn’t… at least no more so than a "normal" NXT-G program. After all, you don't have to understand which LabVIEW routines are called when you use a certain NXT-G block, or what machine language commands each of those LabVIEW commands is later translated to. In short, I’ve just followed the example set forth by the folks who designed the software – make it modular, and build on things you already know work. The result was that the program was “written” in about two days, with about a week of testing (because each test run had to last something like 2-4 hours).

Please note this is not because I’m a genius, or because I have deep insight into the minds of programmers (I’ve never had a programming class in my life), or anything like that. I’m just following where a modular language like NXT-G leads me… and curiously, it’s led me to write code that is far more modular and compact than anything I ever ended up writing in text-based languages.

OK, that’s it – tomorrow is the launch, and we’ll see what works (hardware, software, etc.)… and what doesn’t. Cross your fingers… and watch the tracking links on the HALE website to see where these payloads are going "as it happens". I'll also be blogging on this tomorrow during the actual mission.

--
Brian Davis
