Preventing Chrome from closing with JAWS scripting

One of my pet peeves while using Chrome is when I’m listening to something in a browser tab, working in another one and then decide to close the current one. I press Alt+F4 instead of Ctrl+F4 and my audio stream is interrupted.

Sure, you can relaunch Chrome and then press Control+Shift+T to regain all of your tabs, but you still have to find your spot in the audio stream.

Well, I’ve devised a little JAWS script and figured I’d share it here.  Yes, I know that Edge has this built into its browser, but I tend to like Chrome better… and I like tinkering with code.

So here are the steps:

  1. Launch the JAWS script editor while you’re in a Chrome window with Insert+0 on the number row.
  2. Press Control+End to get to the bottom of the script file.
  3. Press Control+E to create a new script.
  4. Type in a name for the script, something like PreventsClosing, and press Tab. Note: this edit box does not accept spaces, so use capitals if you like.
  5. Check the box that says “Can be attached to key”, and press Tab.
  6. Type whatever you want in the Description and Synopsis edit boxes.
  7. Tab through to the “Assign to” hotkey area.  Press Alt+F4. The window won’t close on you; instead, the key sequence will be copied to this field.
  8. Tab to OK and press Enter.
  9. Now you are ready to copy the code below:

Var
	Int button
Let button = ExMessageBox ("Are you sure?", "Close Chrome", MB_YESNO|MB_DEFBUTTON1)
If button == 6 Then ; 6 = IDYES, the user confirmed
	TypeCurrentScriptKey () ; pass Alt+F4 through so the window actually closes
EndIf

Once you’ve pasted the above code, press Control+S and the script will be compiled.
Press Alt+F4 to close the script editor.
You can now test your modifications by pressing Alt+F4 in Chrome.
A message window will pop up asking you if you really want to close Chrome.
Press Enter on Yes and it will proceed.  Otherwise, choosing the No button will keep you in Chrome.



Posted in Uncategorized | Tagged , , | Leave a comment

Video games. Yeah, blind people play them too

Video games have always been somewhat of a pastime for me; especially when I was a teenager during those long Winterpeg months.  Oddly enough, I was always attracted to games with intricate sound environments.  My favorite game was one created on the old CoCo 2 from RadioShack called “Dungeons of Daggorath”.  There’s a port of the original game for Windows 10 which you can find here:

Dungeon Of Daggorath Download

Because of how the sound design was constructed and the fact that you were confined, well, to a dungeon, the game suited me perfectly.  The spiders, snakes, giants and knights would each make their own particular sound, which would grow louder the closer they got.  I memorized the 4 levels of the dungeons and was able to defeat “The big Boss”, which turned out to be an old evil wizard.  Anyway, a game that was designed for sighted people was completed by a blind guy.  I’m sure I would have posted my gameplay on YouTube if it had been around.  Now you can do a search on that platform and find multiple blind people playing off-the-shelf games just as well as any sighted person because of accidental accessibility.  But the times, they are a-changin’.  I won’t get into too much of that just now, as I have gone longer than I had expected, but you can find more about the proposed legislation here: New games must comply to accessibility guidelines after FCC waivers.

So there’s that.  But then, fast-forward to five years ago.  Well, that would be rewind, but nevertheless, I had a chance meeting with a guy named David Evans.  He had this idea of creating an audio game for the blind.  Granted, it’s not a shiny new concept; you can find tons of audio-based games throughout the interweb (yes I said interweb), but what he was bringing to the table was, in my humble opinion, refreshing.  Apply the same sort of creativity and quality to an audio-based game which would be for the most part designed for blind gamers, but would have enough appeal to pull in sighted gamers as well.  But enough of my reminiscing. Let’s let David tell us in his own words why he decided to start on this project.  We were both interviewed in the “Niagara this week” newspaper.  You can follow this link to read the story.

St. Catharines company creates video games for the visually impaired


How to quickly toggle back and forth between screen readers

One of the things I like when I’m testing web pages is efficiency.  I also like to minimize key presses and procedures of loading and unloading programs.  In particular, bouncing back and forth between JAWS and NVDA.  So, with the help of AutoHotkey and a few lines of JAWS scripting, I managed to create a screen reader toggle key. Now every time I press Control+F12, it unloads the current screen reader and loads the other one.  What follows is how to set this up on your own system.  It takes a few steps and a bit of know-how, but once it’s set, it’s pretty darn sweet.


Setting up JAWS

First thing is to create the shortcut key/script to unload JAWS.  Yep, you’re going to make a JAWS script.  It’s actually not too difficult and I’ll walk you through it.  Let’s get to the JAWS script editor by pressing Insert+0; that’s the 0 on your main keyboard, not the number pad.

Next you’ll have to make sure that you’re in the Default script file by pressing Control+Shift+D.

Still with me?  Good. Now press Control+End to get to the bottom of this script file.

Next, press Control+E to create a new script.

This is essentially a form to gather information for your new script, which you navigate with the Tab and Shift+Tab keys.  The first field that you land on is the script name.  I called mine KillJAWS (the name field doesn’t accept spaces), but you can name it whatever you want.

The following field is a checkbox which you’ll need to check, as you want to connect this script to a shortcut key.

The neighboring 3 fields are Description, Synopsis and Category. JAWS incorporates these fields into its help system.  You can put whatever you want in there, or just leave them blank.

The last field is labeled “Assign to”. This is where you designate your shortcut key.  I used Control+F12, but you can use whatever key combination that you like.  Just press the key combination and it will populate the field with the text equivalent.  If the shortcut is already in use, JAWS will warn you and you’ll have to choose something else.

Finally, your next tab press will bring you to OK.  Activate it and let’s get to the fun stuff.


You are now brought back to an editing window.  Your cursor is located between the script name and the “EndScript” command.  This is where you will copy and paste the 3 lines below.  I will explain what they do following the lines of code.



Run ("nvda")
ShutDownJAWS ()
Pause ()




The first line runs NVDA. No need to give it a path, unless you have it installed as a portable version, in which case you’ll need to explicitly point to its location.
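For instance, a portable copy would need something like the line below. The path is just a made-up example; point it at wherever your copy of nvda.exe actually lives:

```
Run ("D:\PortableApps\NVDA\nvda.exe") ; example path for a portable install
```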


The second line calls another script named ShutDownJAWS().  Oddly enough, it does exactly that.

The final line tells JAWS to wait (or pause) for a moment.  Depending on your PC speed, you may have to add a few of these lines.


Okay, so we’re done with this part. Now, let’s save your work by pressing Control+S.  If all is well in the world of JAWS scripting, the script will be compiled, JAWS will say “compile complete” and you’re set.

If you get an “unexpected run command encountered” error, you’ll need to edit the NVDA line: erase and retype the quotes, since curly “smart” quotes pasted from the web won’t compile.

Thanks to StickBear for these modifications/suggestions.


Now close the scripting window with Alt+F4 and let’s try your new script.

Just make sure you have a way of running JAWS again, or you might find yourself with a very quiet computer.  Control+Alt+J usually does the trick.  If not, press Windows+R to open the Run dialog, type in JAWS and the version number, for example JAWS2020, then press Enter and it should load up.

Congrats, you’ve made your first JAWS script.


Integrating AutoHotkey


AutoHotkey has been around for years.  It’s a powerful, easy-to-use shortcut key creator and script automator; is that even a word?  Anyway, you’ll be using this tool to unload NVDA and load JAWS back into memory.

Let’s go find AutoHotkey.  You can find it at this link:

Spend some time on the main page to see examples of what AutoHotkey can do.

I’m not going to walk through the download and install process, but I would recommend downloading the current version, so we’re all on the same page.

Once installed, AutoHotkey runs in the background and will have a system tray icon.  We’re going to go find that icon, activate it and get into the menu.

Press Windows+D to get to the desktop.

Shift+Tab until you get to the system tray icons.  Once there, you can navigate these icons with the arrow keys.  You’ll be looking for an icon that reads “settings.ahk”, or something similar.  The .ahk file extension is reserved for AutoHotkey script files.  Once there, press either the Applications key or Shift+F10.  This will bring up the AutoHotkey menu.  Arrow down to “Edit This Script” and press Enter.

Notepad will load up and we can now edit our NVDA to JAWS transition script.  Copy the lines below into that script.  I’ll explain each line’s function after the code below.



^F12::
Run c:\progra~2\nvda\nvda.exe -q
Sleep 10
Run jaws2020
Return



The first line designates the shortcut key, which in my case is Control+F12.  The caret sign ^ represents the Control key, the number sign # is for the Windows key, the exclamation point ! is for Alt, and the plus sign + is for the Shift key.  You can find more info on AutoHotkey bindings here:


Back to our line description.  Following the caret symbol is the designated key, which in my case is F12.  The 2 colons following the key designator tell AutoHotkey that this is a shortcut, and they also mark the start of a new script.
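As a quick sketch of those modifier symbols, here are a few hotkey definitions; the keys and actions below are just placeholder examples, not part of our toggle script:

```autohotkey
^F11::MsgBox You pressed Ctrl+F11    ; ^ = Control
#n::Run notepad                      ; # = Windows key
!+x::MsgBox You pressed Alt+Shift+X  ; ! = Alt, + = Shift
```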


The next line in our script tells NVDA to shut down.  This is done with the -q command-line switch.

This will only work if you installed NVDA in its default folder.  Otherwise, you’ll have to modify this line.
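For example, if you had installed NVDA to some custom folder (the path below is purely hypothetical), the line would become:

```autohotkey
Run D:\Tools\NVDA\nvda.exe -q  ; adjust the path to wherever nvda.exe actually lives
```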

Also note, you don’t need to double the backslashes for paths in AutoHotkey scripts.


The Sleep command pauses the script for a user-defined time; in this case 10 milliseconds.  You may need to tinker with this if things don’t work properly.


The second-last line runs JAWS.  This line will have to be updated whenever you go up a major version.


Finally, the Return command signals to AutoHotkey that the script is complete.


Once you’re done editing the script, press Control+S to save your work and Alt+F4 to get out of Notepad.


The final step is to go back to your desktop with Windows+D, Shift+Tab until you get to the system tray, find the AutoHotkey settings.ahk icon and activate the menu with Shift+F10.  Arrow down to the “Reload This Script” option and press Enter.


You are now in business.  Press your previously created shortcut key and it should unload the currently running screen reader and load the other one.  This setup doesn’t care which screen reader is loaded first.  At any rate, there you have it; a fully automated screen reader toggling shortcut key.


Further Reading and Resources

AutoHotkey Documentation:

JAWS scripting, from Freedom Scientific:



When public services fail and put people in jeopardy

It’s been interesting to lay low these last few weeks, playing my part in keeping that pandemic at bay. I’ve been busying myself tinkering with a Raspberry Pi, figuring out how to insulate our cold storage to make it part of the rest of the basement, and various other mundane projects. Oh, and spending quality time with my wife of course. 😊
One thing I didn’t expect, however, was having to deal with our local 911 service and, by extension, our medical system.

Friday morning, my wife woke up in excruciating pain. First off, you must understand that she has a high tolerance for physical discomfort, as she has a condition that causes chronic pain. So, when she says she’s in pain and she can’t stand it, I know there’s something definitely wrong; especially when she’s the one who asks if I can call 911 for her.

My first call to 911 was rather annoying. I’m sure the woman I spoke to was probably overworked, but that’s still no excuse for the way I was treated. She would ask me questions and promptly cut me off while I was trying to respond. I was told at first that an ambulance was on its way, but that there were more questions she still had to ask. I answered them, and then she informed me the ambulance was canceled, as she deemed our situation was not of an urgent nature, and that I was being transferred to a nurse who could help us over the phone. The nurse turned out to be a reception desk clerk who said someone could return our call, but it would be within the next 2 days. I hung up and decided to call Telehealth, a phone line that delivers a similar service. Annoyingly enough, I was informed that the timelines would be the same for that service as well. I finally lost it and told the agent: “Listen, I’m blind and have no means to drive my wife to the hospital. My wife has mobility issues and can’t walk long distances.” Suddenly, I was given a time slot within the next hour. I hate having to use the disability card, but it seems that in this climate of paranoia and sparse resources, a lot of genuine emergency cases are being deemed non-urgent.

After hanging up, I figured I would try again with 911. I made myself sound more desperate and played the disability card again. This time, the agent actually listened to me and an ambulance was finally dispatched. And you know what? I don’t feel guilty one bit. My wife was told by the doctor that if she had waited, she would have gone into sepsis and it would have been too late for her. She had kidney stones and an infection. So, if I had played by the rules and waited for that call… she would have died! I’m not being melodramatic here, I’m just stating a fact. I understand that things right now are quite atypical, but at the same time, perhaps the 911 agents might want to spend more time listening rather than trying to find an excuse to deny emergency service to people who genuinely need it.


When the Current Government Forgets Their People

Today on my blog, I decided to have a guest write an entry.  Incidentally, it’s my wife.  🙂 Here in Canada, more precisely in Ontario, our elected officials have been making radical changes throughout government services, and this will negatively impact all of us.  Although I tend to shy away from political topics, I feel that it’s important to have intelligent dialogue and to make sure that we all understand the impact of the decisions that politicians supposedly make “for the people”.

Without further ado, here is her article.


When the Current Government Forgets Their People

By Erin Courcelles


I would like you to try and fully understand the damage that is being done by the conservative government in Canada. My husband (Martin) and I are from the generation that fought for the right to be educated despite having disabilities. My mother and I were told I would NEVER get past grade 9, but she refused to listen. Thankfully I didn’t either, and I graduated from higher education with honours, twice!

Martin’s father was told to send him away to a school for the blind in another province. Instead, his father painstakingly learned braille and made sure that he was offered as many of the same opportunities as his cohorts as possible.  I know of many stories of people who fought for their place in school or university by making sure textbooks, tests and lecture notes were made available in alternative formats. Because of this, we and our friends hold decent jobs, own property, pay taxes, take vacations, go out for girls’ or boys’ nights out. We contribute to society. We are like everyone else, despite the small investment made to level the educational playing field.


What Doug Ford, Jason Kenney and all the conservatives are doing by preventing disabled children from going to school means that they will not be employable. The jobs that they hold will be the ones without benefits, and they will be assured a place among the working poor; more than likely, they will be on some form of low-income support.


I can talk about how this is a human rights issue. In fact, that is how we fought for our right to an education decades ago, but let me tell you how this will affect you. YOU, the able-bodied person who wants to retire in the next 15-20 years: this is the damage we see with shortsighted politicians.  Ford wants to save $1,000 now, even if it is going to cost us $20,000 later. Because he will not be around, but neither will your pension.  We have been saying for years that 15% of Canadians are disabled, but the new data on disability in Canada (2017) says it is actually 22%.  You may think this statistic is mostly seniors; however, according to the 2017 data, more than 540,000 Canadian youths aged 15 to 24 years (13%) had one or more disabilities.  In 2017, the employment rate was 59%. You may think that is bleak compared to the 80% for the non-disabled population, but in 2011, the employment rate of Canadians with disabilities was 49%. How does this affect your pension or your taxes? 41% of the low-income population are persons with a disability. In Ontario, in April 2019, 54% of the households on social assistance were on ODSP. Imagine what will happen when we stop valuing 22% of our population to the point of not educating them.


More and more, we are learning how to accommodate our students in our ever-changing world. We know how to level the playing field. For the most part, we know how to inspire and engage students with disabilities. With the disabled community growing and the employment gap decreasing, we know how to keep this growing 22% of the population off welfare.

In turn, over the last 10 years we have seen the disabled population start businesses, advocate for themselves, teach and make advancements in technology, in ways we could never have imagined. For example, parents with strollers should thank wheelchair users for curb cuts, elevators and automatic doors.  Personally, I think everyone needs to thank the visually impaired for talking technologies, which are now part of smart homes throughout the world.  Or the foreign students who thought closed captioning was there to help them learn a new language, when it was actually invented to help the deaf.


When we do not accommodate, a growing number of the population will suffer.  That is a serious problem. I am not saying there won’t be people like Frida Kahlo who had the artistic talent to overcome her challenges or Helen Keller who inspired many and battled for everyone. I am saying that we have the ability to make sure that everyone has the opportunity to go forward together, as a cohesive and productive society. If we do not address this, the people of Canada will be stunted AND we will have to pay to support the people we left behind.


Acting Blind, It’s A thing!

When it comes to portraying people with disabilities in movies and/or series, it seems like producers are still missing the mark. Just a little caveat on my post: the topic is blind-centric, because that’s what I know. Other disabilities (both visible and unseen) also need to be considered. Don’t get me wrong, I thoroughly enjoyed the “Daredevil” series, because everyone knows that I Am Daredevil! All kidding aside, Charlie Cox, who played Matt Murdock (Daredevil), did a fantastic job, but I keep wondering… why wasn’t a blind person chosen for the part? Aren’t there blind actors out there? Do you have to be sighted to be a good, convincing blind person? Admittedly, I did consult on a series a few years back to help a sighted actor act more blind, but I was pulled in after the series had already started. Turns out he was a nice guy, even took me out for beer, and really got into his portrayal… did a pretty good job at that. I guess what I’m getting at is, there needs to be a more concerted effort from the acting industry to reach out to the disability community and have a better representation of people with disabilities on screen. Things are changing, but it seems like a very slow process.

Just recently, I happened upon a casting call for a commercial where the company decided to reach out for a blind person to fill one of the roles. Not because it was a blind role, but because they thought of being inclusive. I thought it was a breath of fresh air. I still didn’t get the role, though, as they decided to go in a different direction, but at least the thought was there.


This whole diatribe of mine stems from a series my wife mentioned to me last night called “In The Dark”. It sounds like a fantastic show and I’ll most likely watch it, because the main character is a blind woman with a guide dog. But, you guessed it, the actress is not blind. There’s an article which describes why the producers came to this decision, so I’ll leave it to you to read, but, I don’t know. I’m fairly well connected to social media, the blind community and the disability community in general, but I never did hear of a casting call for this part. Perhaps the industry just isn’t casting their talent acquisition feelers far enough, or the organizations they contacted just didn’t know how to handle the request? Who knows. At any rate, the article is found at the link below. I’d be curious to hear people’s comments.


‘In the Dark’ Producers: Why We Didn’t Cast a Blind Actress in Lead Role


Digital Accessibility: A Love / Hate Relationship, Wed, Feb 13, 2019 at 6:00 PM

I’m participating in a digital accessibility round table in Toronto. See the description below with a link to the Eventbrite tickets.
It’s open to the public.  Space is limited.


Digital Accessibility: A Love / Hate Relationship, Wed, Feb 13, 2019 at 6:00 PM |


Join Wealthsimple and Fable Tech Labs to meet and learn from Toronto’s technology leaders in inclusive design and digital accessibility.

6:00 – 6:30 p.m.: Networking
6:30 – 6:45 p.m.: Keynote
6:45 – 7:15 p.m.: Panel
7:15 – 7:30 p.m.: Q&A
7:30 – 8:00 p.m.: Networking

Light refreshments and food will be served. There will be ASL interpreters at the event. To access the elevator, please enter through the 862 Richmond Street West entrance (left side of the building). There will be signage to direct you.

For additional accommodations, please email

Samuel Proulx is an expert technology user today, but it didn’t start that way. Sam has had a love/hate relationship with technology since he was just ten years old, using Windows 95. As a blind user, he found the web to be an equalizer. As the Internet blossomed, though, Sam grew with it, but has found the equalizing effect it once had to be lost over time.

Through his frustrations, Sam believes the Internet is strongest when it’s built by its users. Sam has managed online communities in various spaces for 18 years; he brings this expertise to Fable, helping us build an inclusive team of people from all walks of life, which spans across the entire country.

Description of Panel

Everyone has a different relationship with technology, often shaped by their abilities, opportunities, and life experiences. Our five panelists will touch on the unique relationships that they have built with technology through online and offline communities. With topics ranging from first experiences with technology to the independence technology has provided them, this discussion is sure to shine a light on the power, for better or worse, of technology and community alike.



Daniella Levy-Pinto, Accessibility Expert and Consultant

Martin Courcelles, Senior Accessibility Technology Specialist at OLG

Ka Yat Li, Accessibility and Usability Consultant

Vu Nguyen, Full-Stack Developer

Minette Samaroo, Accessibility Tester and Advocate of AEBC

EventBrite Tickets


I Haven’t a Clew

It may seem that I don’t know how to spell, but it turns out I found an interesting and intriguing little app. Clew is a way-finding utility that, instead of relying on infrastructure such as beacons, uses the camera on the iPhone to record video and generates a breadcrumb-like map for the user to follow. Here’s what happens:
 The user taps on the “Start recording” button within the app,
 The user then walks with the phone’s camera pointing away from them, recording their surroundings,
 Once arrived at their destination, the user taps “Stop Recording”,
 The user can then tap on “Start Navigation” and “Get Directions”. The phone will then guide them by means of speech and sound effects back to their original starting point.

That’s mainly all there is to the app at present, but I imagine the developer will add features such as saving routes or sharing routes at a later date. Although this is a way-finding app geared towards blind users, I can imagine that once other features are added to the app, more people would find it rather handy.
You can find the app’s detailed description, a link to the iOS App Store, along with a video here:
Clew App Page


It’s all about perspectives

For those who know me, you’ll inevitably have seen how I will shamelessly sacrifice myself for a good joke. Well, that content is more suited to Facebook, but nevertheless. I happened upon an NY Times opinion piece entitled “How to Really See a Blind Person”. It was well-written and did have some interesting content, but what hit me, when I started comparing notes and wondering how I would react in the situations he describes, was how different our experiences were simply because of perspective. For example, for him, going through an airport is a nightmare. For me, it’s a chance to meet new people and crack jokes with staff and security. For him, blindness is an obstacle; for me, it’s an ice breaker.


Don’t get me wrong, there are days where I have pity parties, but again, it’s all perspective. There is no correct way to deal with blindness and we all cope differently. I guess I’m simply saying that I appreciate who I have become and enjoy laughing at myself. Being blind can be a regular sitcom, if you have the right perspective.



How To Really See A Blind Person




A True Accessibility Win

Accessibility has always been a challenge for people with disabilities. I’m fortunate to live in the City of Toronto, where I have the Accessibility for Ontarians with Disabilities Act (AODA). Enacted in 2005, it is supposed to enhance accessibility within our province and make Ontario fully accessible by 2025.
But now, there are also whispers of proposed federal accessibility legislation similar to the Americans with Disabilities Act, which could make things easier for all Canadians with disabilities, not just those in key provinces.

Even though the act has made things easier for people with disabilities in Ontario, there are still numerous physical and discriminatory challenges that frequently arise. Take, for example, riding transit. For a person with a mobility disability, not all subway stations are wheelchair accessible. The ones that are accessible may have issues such as broken-down elevators or elevators in perpetual repair. There’s one at Kennedy station, for example, that has been down since October, and the wait for it to be back in service keeps being extended. See this story: Elevator down for Repairs

Snow proves to be quite the challenge as well. If sidewalks aren’t cleared after major storms (which happens), anybody using a mobility device is stranded.
I recently had to reach out to my city representative, as it had been more than 72 hours and the sidewalks had still not been ploughed, which could be deemed a safety issue for everyone. Things seem to have improved since then. For information on sidewalk clearing in Toronto, read this story: It snowed. Now who has to do the shoveling?

As a blind person, I frequently encounter discrimination when using taxis or ride-share services. The most recent situation was last Tuesday morning, when an Uber Assist driver stated he had dog allergies and drove away. To his credit, he did order another car for me, but that’s beside the point. If you’re going to drive for Uber Assist, you are going to encounter dogs. I gather the one-hour online training had not sunk in with this particular driver. I reached out to both Uber and Uber Canada on Twitter, but they have not responded as of this writing. Turns out this is not an isolated issue either. See this story: UberAssist driver fined for denying ride to Paralympian

Okay, but what about the accessibility win, you ask? Well, this is exciting, for me anyway. I frequently use the Kennedy subway station. Currently, there is a lot of construction around that area due to the Eglinton Crosstown project. Pedestrian traffic has been affected and rerouted via a lighted intersection. This is all fine and dandy, but the lighted intersection did not have audible pedestrian signals. Most of the time, this isn’t an issue, as I can read traffic fairly well. But this particular intersection has 2 bus turning lanes, pass-through traffic for the parking lot and a passenger drop-off area; so yeah, pretty busy and confusing.

Addressing this issue proved to be more challenging than I had first thought. The transit folks said it wasn’t their issue, since the intersection is owned by the city, even though it’s on transit property. Turns out this was true, as what I thought was a driveway is actually a street. So off I went to bug the city about the intersection. They stated they could fix it within 4 hours. I was ecstatic. It was short-lived, however, as I got a callback informing me that a 4-hour turnaround is for a broken or defective traffic light. To have an accessibility assessment done for this crossing, a request would have to be created, and it would take a minimum of 9 months to be acted upon. I put in the request, without too much hope. Four hours versus 9 months for a solution is pretty disheartening.

A few days later, I figured it was time to try a new tactic.
I decided to approach the Crosstown folks and my local city representative… and this is where the magic begins. I explained the situation via a well-worded email. The response was swift (under an hour), and audible pedestrian signals were installed by end of day! That sort of thing is unheard of in Toronto. It’s been my experience that installing an audible signal at an existing traffic light takes up to 3 years. The last 3 requests I made through the city were installed after I had moved away from the areas, which defeats the purpose; although I’m sure other people benefit from them.

Anyway, all this to say a simple thank you to the folks at Metrolinx working on the Crosstown project and also the construction team that installed the audible signals. It’s because of your swift response and actions that I and others are now able to safely cross a very busy intersection. This is a true example of how accessibility can be quickly implemented when we all work together. Hopefully, the City of Toronto can learn from this example and expedite the installation of audible traffic signals. Why not just make it a standard that all new traffic signals include the audible feature?
