Building Open APS: a low-cost traffic signal for low-vision and blind pedestrians using a Glowforge…

Written by cgroom | Published 2017/10/27
Tech Story Tags: accessibility | new-york-city | open-data | prototyping | raspberry-pi


In this post, I’ll discuss the design considerations and technical approach for a prototype device and submission for the New York City / Barcelona challenge Call for Innovation — Enhancing Mobility for Blind and Low Vision Pedestrians.

I worked with Thomas Logan (my brother-in-law; the founder of Equal Entry, an accessibility consulting company) to demonstrate that it is possible to build an Accessible Pedestrian Signal (“APS”; these are the beeping boxes you find at some crosswalks) with greater functionality than the current New York City implementation, for a fraction of the current cost. Our Open APS design uses public data, an open-source approach, and commodity consumer tools — we built our prototype using a Glowforge laser cutter and a Raspberry Pi.

Today, NYC spends an average of $43,600 per intersection for 8 APS devices (2 per corner), or $5,450 per device. Installation costs $985 per pole, which puts the remaining per-unit cost at roughly $4,465. By comparison, our cost of goods is just $66. By cutting per-intersection rollout costs by orders of magnitude, our hope is that the city can more quickly make streets safer for everyone.

In this post I’ll cover our proposal; design considerations for accessibility (low-vision, blind, color-blind, and deaf-blind people); tips for using a Glowforge laser cutter for accessibility prototyping; and wiring up a Raspberry Pi.

Disclaimer: I’m a minor angel investor in Glowforge.

Open APS Proposal

NYC has Accessible Pedestrian Signals (APS) at various intersections around the city. Here’s one:

There’s no doubt these signals help blind and low-vision people more safely cross intersections. While we applaud the city for installing these, we have two criticisms:

  1. They are few and far between. This is because the city can only afford to add these to 75 intersections a year. We believe lower costs would translate to a larger-scale deployment.
  2. They don’t do much. They tell you when the light is green; that’s it. We believe these devices could give additional navigation and safety guidance through improved visual signage, tactile signage, and voice.

Thomas and I propose providing a free and open service for generating all the necessary assets (signage, audio, software) to create an APS tailored to any particular crosswalk; hence the name “Open APS.” Hardware like a CNC milling machine can be used to create customized signs, and the software and audio would be included in a ready-to-use flash image for a Raspberry Pi (or any cheap computer).

To prove this, we built a working prototype using $66 of parts and a Glowforge laser cutter.

Prototype Open APS for the North-East corner of 7th Ave and W 23rd St, crossing W 23rd St.

We envision a workflow like:

  1. A city engineer goes to our website, and types in the intersection and crosswalk they are interested in.
  2. We will use the public street centerline dataset to look up information about the crosswalk (street width, bike lanes, and so on); see the sketch after this list.
  3. The engineer validates this data, then clicks to continue.
  4. We generate a .zip file containing all the assets needed to produce this APS device. This includes SVG files for a CNC machine, and a flashable disk image for a Raspberry Pi that includes pre-generated text-to-speech audio and necessary software.
  5. Commodity hardware is used to produce and assemble these devices.
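
As an illustration of step 2, here’s a rough sketch of what that lookup could look like against NYC Open Data’s Socrata API using the sodapy client. The dataset ID and field names below are placeholders, not the real street centerline schema:

```python
# Hypothetical lookup against NYC Open Data (Socrata). The dataset ID
# "xxxx-xxxx" and the field names are placeholders for illustration only.
from sodapy import Socrata

client = Socrata("data.cityofnewyork.us", None)  # unauthenticated, rate-limited

def lookup_crosswalk(street, cross_street):
    # Pull the centerline record for the street being crossed at this corner.
    rows = client.get(
        "xxxx-xxxx",  # placeholder dataset ID for the street centerline data
        where=f"street_name = '{street}' AND cross_street = '{cross_street}'",
        limit=1,
    )
    if not rows:
        return None
    row = rows[0]
    return {
        "street": street,
        "width_ft": float(row.get("street_width", 0)),  # placeholder field
        "bike_lane": row.get("bike_lane", "none"),       # placeholder field
        "traffic_direction": row.get("traffic_dir", "two-way"),
    }

print(lookup_crosswalk("W 23 ST", "7 AV"))
```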

The Open APS would need power (~3 Watts) and a hard-wired connection to the walk signal (to ensure safety, the actual signal light should be the source of truth). This device could operate in a reduced-functionality “orientation-only” mode if hard-wiring is unavailable or not necessary (e.g. in a park). Internet access is not needed.

Accessibility Features

Show the street name being crossed

  • Use a large, high-contrast sans-serif font
  • Keep it visually simple — don’t show the cross street
  • Braille below name; cells are 0.4" tall

Arrow points in crossing direction

  • High contrast
  • Tactile

Device Speaks When Pressed

The entire front of the signal box is a button that triggers audio when pressed.

  • Short press: crosswalk status (walk, don’t walk), name of street being crossed.
  • Long press: additional information including the cross-street, width of the street, and hazard warnings like bike lanes, medians, traffic direction, and bus lanes.

Here’s an example of the long-press audio when the light is red.

Significant consideration is required for what, exactly, should be said and in what order. Our primary goal is safety, so we want to clearly disambiguate the cues for walk / don’t walk by using different word order:

  • Walk: “West 23rd St. Walk sign is on. Cross West 23rd St.”
  • Don’t Walk: “Wait to cross West 23rd Street. Wait.”
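
To make the word-order rule concrete, here’s a small sketch of how these messages could be assembled from the crosswalk name (the function names are ours, for illustration):

```python
def walk_message(street_crossed):
    # Walk phrasing: lead with the street name, then confirm it again at the end.
    return f"{street_crossed}. Walk sign is on. Cross {street_crossed}."

def dont_walk_message(street_crossed):
    # Don't-walk phrasing: start and end with "Wait" so it can't be mistaken
    # for the walk message, even if only a fragment is heard.
    return f"Wait to cross {street_crossed}. Wait."

print(walk_message("West 23rd Street"))
print(dont_walk_message("West 23rd Street"))
```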

Standard APS Walk/Don’t Walk Signals

Like existing APS devices, when the crosswalk is green our device will vibrate (for now, we use a piezo) and emit a “walk” beep tone. When the crosswalk is not green, we will emit a “locator” sound instead.

Walk/Don’t Walk Lights

Low-vision people may have trouble seeing the crosswalk light from far away; and in New York City, buses and trucks frequently block the signal. To help with this, we also add bright “walk” and “don’t walk” LED lights to the top of our device. Note that the “walk” LED is not solid green, to avoid problems for people with red/green color blindness.

Unique ID, QR code and Beacon

While it’s not directly an accessibility feature, we propose that we assign each device its own unique ID, with its own lookup URL. This URL is displayed as a QR code on top of the device, and is also broadcast as a BLE beacon (~2m/6ft range).

Top of device, with walk / don’t walk LEDs and etched QR code.

This has several potential benefits:

  • Using a smartphone app, a blind or low-vision person can use the beacon to get precise location information and real-time updates about the crosswalk (e.g. construction nearby).
  • It becomes easy to report maintenance issues with a particular device.
  • Anyone who loads the QR code or beacon URL can learn more about city accessibility programs and additional resources.
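
Generating the QR code artwork itself is easy to script; here’s a sketch using the qrcode Python library (the URL is a placeholder for a per-device lookup URL):

```python
import qrcode
import qrcode.image.svg

# Hypothetical per-device lookup URL; each Open APS would get its own.
url = "https://example.org/aps/7av-w23st-ne"

# Render as an SVG path so it can be dropped into Inkscape and etched
# (or rasterized) on the Glowforge.
img = qrcode.make(url, image_factory=qrcode.image.svg.SvgPathImage)
img.save("aps_qr.svg")
```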

Building an Accessibility Prototype with a Glowforge

The entire prototype was built and assembled in my apartment in Brooklyn. The sign and enclosure were designed in Inkscape and etched and cut using a Glowforge. Note that while the Glowforge was an awesome prototyping tool, it’s not what I’d recommend for the final Open APS devices; those will need weatherized enclosures and milled signs.

The sign on the front of the APS is etched into a 6" x 6" piece of black anodized aluminum. I glued on a white acrylic arrow, plus the braille, which was etched into black acrylic (0.4" letters, on a 5.3" x 0.66" piece).

The enclosure is a 6" x 6" x 2" black acrylic box.

  • It uses tabs created by the “Lasercut Box” Inkscape extension.
  • The top of the box has an etched QR code (a bitmap image), along with holes for the LEDs and an etched orientation arrow pointing North.
  • The bottom has a grid of holes for sound.
  • The front has small holes cut to exactly fit the press-button leads.
  • The back has a hole for the Raspberry Pi USB power cable.

The source graphic files look like this:

Here’s a video of the Glowforge cutting the enclosure:

Tips and Tricks

  • Laser-etched aluminum looks great but it’s finicky. You want to blast off the top black layer, but not have the laser spend much time on the clean aluminum. Otherwise, the metal just gets hot and warps. I used the manual settings at 0.02" thick, with engraving set at 1000 speed, 30% power, 340 DPI, with 2 passes.
  • I was happy with how the QR code turned out. Black acrylic turns grayish where it’s etched, providing some natural contrast.
  • Cutting out white acrylic shapes and gluing them onto a black background definitely “pops!”

Braille on a Glowforge

While the Glowforge is great for making high-quality Braille lettering in acrylic, there are a few considerations:

  • Braille needs the dots to be raised. This means you’ll need to etch away all the material around the raised dots. Etch deeply for maximum tactile benefit.
  • There are standard, expected dimensions for cell sizes. We used 0.4" high.
  • Signs should be printed using Grade-2 Braille. This is not a 1-to-1 transcription of written English because it includes many abbreviations and contractions. Use an online translator.
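
If you’d rather script the translation than use a website, the liblouis Python bindings can produce Grade-2 (contracted) Braille as well; a small sketch, assuming liblouis and its UEB tables are installed:

```python
import louis  # Python bindings for liblouis

# Translate English text to contracted (Grade 2) braille.
# "unicode.dis" makes the output use Unicode braille cells; "en-ueb-g2.ctb"
# is the Unified English Braille Grade 2 table shipped with liblouis.
text = "West 23rd Street"
braille = louis.translateString(["unicode.dis", "en-ueb-g2.ctb"], text)
print(braille)
```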

Here’s my process for making a Braille vector image in Inkscape. My goal is to produce a filled rectangle with circles cut out of it, so I can tell the Glowforge to engrave this as a single vector object where filled regions are removed and non-filled areas remain.

  1. Download a Braille font (I used Braille by Cal Henderson)
  2. Type your text into a Braille translator website, using Grade-2 Braille; copy and paste the Unicode result.
  3. In Inkscape, draw a black rectangle. Then select the text tool, choose the Braille font with a white fill, and paste in your Braille.
  4. Select your text and the rectangle behind it. Go to the Path menu > Exclusion. This “carves” the letters out of the rectangle.

I printed this to the Glowforge using the deepest possible etch settings.

The Raspberry Pi “guts”

Inside the enclosure there’s a Raspberry Pi connected to USB-powered speakers, with GPIO pins attached to the push buttons and LEDs. For prototype purposes, we mocked up the traffic light signal timing, but it would be straightforward to connect this to a wired signal.
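
On the software side, that wiring boils down to a few GPIO calls; here’s a minimal sketch using RPi.GPIO (the BCM pin numbers are arbitrary placeholders, not our actual assignments):

```python
import RPi.GPIO as GPIO

BUTTON_PIN = 17      # front-panel push button (placeholder BCM pin)
WALK_LED = 22        # "walk" LED (placeholder)
DONT_WALK_LED = 27   # "don't walk" LED (placeholder)

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup([WALK_LED, DONT_WALK_LED], GPIO.OUT)

def set_walk(walk_on):
    # Drive the two LEDs from the (mocked or hard-wired) signal state.
    GPIO.output(WALK_LED, walk_on)
    GPIO.output(DONT_WALK_LED, not walk_on)

def button_pressed():
    # The button pulls the input pin low when pressed.
    return GPIO.input(BUTTON_PIN) == GPIO.LOW
```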

The prototype software is very simple. We run stock Raspbian Stretch Lite and fire up our Python script on boot. A JSON config file drives all application behavior, alongside a folder of audio assets (MP3 files). The only difference between each Open APS installation would be this config file and the audio files.
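
To give a sense of its shape, here’s a hypothetical config for the 7th Ave / W 23rd St prototype, written as a Python dict for readability (keys and values are illustrative, not a finalized schema):

```python
# Illustrative shape of the per-installation config; in practice this would
# live in config.json next to the audio assets. Values are examples only.
config = {
    "intersection": "7th Ave and W 23rd St",
    "corner": "northeast",
    "street_crossed": "West 23rd Street",
    "street_width_ft": 60,                     # drives the long-press audio
    "hazards": ["bike lane", "bus lane"],      # spoken on long press
    "audio": {
        "walk": "audio/walk.mp3",
        "dont_walk": "audio/dont_walk.mp3",
        "long_press": "audio/long_press.mp3",
        "locator": "audio/locator.mp3",
    },
    "gpio": {"button": 17, "walk_led": 22, "dont_walk_led": 27},
}
```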

Audio is pre-generated using Google TTS, via the gTTS Python utility. We play sound using mpg123 (don’t use omxplayer; it cuts off the last half second of audio).
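
Concretely, pre-generating one message and playing it back looks roughly like this (the file name and message text are just examples):

```python
import subprocess
from gtts import gTTS

# Pre-generate the "don't walk" audio once, at image-build time.
message = "Wait to cross West 23rd Street. Wait."
gTTS(message, lang="en").save("dont_walk.mp3")

# On the device, shell out to mpg123 for playback; unlike omxplayer,
# it plays the clip through to the end without truncating it.
subprocess.run(["mpg123", "-q", "dont_walk.mp3"], check=True)
```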

We set up a BLE Eddystone beacon following these simple instructions.
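
For reference, the Eddystone-URL frame the beacon advertises is just a compact encoding of the device’s URL. Here’s a sketch of that encoding per the published Eddystone spec; the URL is a placeholder, and a real advertisement wraps this payload in a service data structure for UUID 0xFEAA:

```python
# Encode a URL into an Eddystone-URL frame (frame type 0x10).
_PREFIXES = {"http://www.": 0x00, "https://www.": 0x01,
             "http://": 0x02, "https://": 0x03}
_SUFFIXES = {".com/": 0x00, ".org/": 0x01, ".net/": 0x03,
             ".com": 0x07, ".org": 0x08, ".net": 0x0A}

def eddystone_url_frame(url, tx_power=-20):
    for prefix, code in _PREFIXES.items():
        if url.startswith(prefix):
            frame = bytearray([0x10, tx_power & 0xFF, code])
            rest = url[len(prefix):]
            break
    else:
        raise ValueError("URL must start with http(s)://")
    i = 0
    while i < len(rest):
        # Use the spec's single-byte expansions where possible.
        for suffix, code in _SUFFIXES.items():
            if rest.startswith(suffix, i):
                frame.append(code)
                i += len(suffix)
                break
        else:
            frame.append(ord(rest[i]))
            i += 1
    return bytes(frame)

print(eddystone_url_frame("https://example.org/aps/1234").hex())
```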

Next Steps

We submitted this proposal to the Call For Innovation Challenge on 11/1/17. If our proposal is accepted, we will work on several fronts:

  1. Build a more durable proof-of-concept and deploy on-site at the intersection of 7th Ave and W 23rd St; observe and collect feedback.
  2. Work with local blind, low-vision, and accessibility communities to get feedback and input.
  3. Build a working website that leverages open data to automatically create the signage and software assets.
  4. Partner with a local CNC milling shop to trial cut a batch of custom signs.
  5. Partner with a hardware vendor to build out a weatherized case with integrated speakers, computer, and wiring, such that all you would need to install a device would be to insert a micro SD card, and attach power and crosswalk wiring.

Final Thoughts

Most of the time in this project went into iterating on the physical interaction model and design; the software and wiring were relatively straightforward by comparison. The design problem of deciding how much information to convey, and how, becomes an order of magnitude more complicated when you consider different modalities and abilities. Thomas and I iterated on this many times until we settled on our prototype design. Having in-house maker tools like the Glowforge made this process much faster and more fun, because we could keep trying ideas while hanging out.

Many thanks to Chancey Fleet for her thoughts on existing APS solutions and areas for improvement; and to Karen Gourgey of PASS and the Baruch College Computer Center for Visually Impaired People for her consultation on pedestrian safety.

