Apple, the words that are missing from Siri’s vocabulary. Even more so with AirPods 2!


Face ID + Siri + AirPods 2 = an almost perfect usability and accessibility tool for Disabled users. What is missing in iOS and macOS that could enhance Siri’s performance?

Translation by Giovanna Roth. Thank you, Giovanna!
Original article in Italian.

@CFFollis

Disabili DOC – Carlo Filippo Follis

«How to make technology our friend» is the title of the chapter of the book “Riprogettiamo il D-Mondo” that discusses the ideas illustrated in this article; its conclusion is that Disabled people need to be more involved in technological choices and in the development process.

Where to buy the Book or the eBook

Starting from the needs of Disabled people, the history of home automation shows that it is possible to improve everybody’s lives. The first domotic systems helped people with disabilities manage their homes by interacting with front doors, household appliances, roller blinds, beds, TVs and video recorders: everything that could be controlled by voice or breath commands, or by switch-and-scanning systems.
After almost three decades we now have smart homes, and for just a few euros we can buy a smart socket that switches all sorts of appliances on and off. The smartphone has become a mobile control center that looks after our needs.

Ironically, though, the requirements of Disabled people seem to be forgotten when new products (hardware and software) and services are designed and developed. Why does this happen? Are the developers of these sophisticated systems bad people? Are they working against us?
Of course not!
The problem is that the recipe for a perfect future often does not include an important ingredient: the contribution of Disabled people.

The intent behind this article is to ask Apple for an evolution of their operating systems that would make them more accessible to all Disabled users.

One last note: in this article I speak of Apple products mainly from personal knowledge and experience, but the concepts are universal, so I hope that Android users won’t feel left out…

… and then there was touch

About a decade ago, a new and innovative process began that culminated in the introduction of touch systems. At the time, Steve Jobs declared that the human finger is the best pointing device ever invented.

However, if Disabled people cannot use their fingers, or can only use them with difficulty or dystonically, how can they interact efficiently with their devices, smartphones or tablets? Why do mobile devices often lack the ability to work with external peripherals such as a mouse or a trackball?

One of these contradictions might be solved during WWDC 2019 – the Worldwide Developers Conference – which on 3 June could see the launch of a new iOS 13 able to pair devices such as a mouse, a touchpad and so on with iPads and iPhones. This is what a lot of people are hoping for.

iPhone X, the turning point of Face ID

On September 12, 2017 something changed forever. With the presentation of the iPhone X, Apple introduced Face ID, a facial recognition system able to recognize the owner of the iPhone without requiring the unlock code to be typed in or a fingerprint to be used to access the device. These two older procedures, for example, were quite challenging for somebody like me, with dystonic impairments caused by spastic quadriplegia.

Face ID immediately became a tool of independence but it was not alone…

Siri, voice and ears of iPhone, iPad & Mac

Siri is what the world calls a “voice assistant”, even though it is much more than that, since it does not only talk but also listens to and interprets our needs. That is why it is more of a “personal assistant”.

Siri is also able to execute commands: if you say «Hey Siri, launch Safari» you will see Safari open on your screen. There are many other similar examples.

The value of Siri was emphasized in 2018 with the release of a new application called Shortcuts. Why is Shortcuts so important?

The answer is quite simple: the application allows the user to create sequences of actions (micro utility apps) that can be recalled vocally through Siri.
For instance, if I have programmed or downloaded such a command on my iPhone or iPad, I can simply say «Hey Siri, “take me home”» and I will find my way home, since the system, through my command, will detect my location and calculate the route.
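
For readers curious about the developer side, the sketch below shows, in broad strokes, how an app can offer such an action to Siri today. It is a minimal, hypothetical example (the class name and activity identifier are invented): it donates a “Take me home” activity so that the system can later suggest it and let the user trigger it by voice or add it to Shortcuts.

    import UIKit
    import Intents  // suggestedInvocationPhrase is declared in the Intents framework

    // Hypothetical view controller that shows directions home.
    final class DirectionsViewController: UIViewController {

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)

            // The activity type string must also be listed under
            // NSUserActivityTypes in the app's Info.plist.
            let activity = NSUserActivity(activityType: "com.example.take-me-home")
            activity.title = "Take me home"
            activity.isEligibleForSearch = true
            activity.isEligibleForPrediction = true            // let Siri suggest it (iOS 12+)
            activity.suggestedInvocationPhrase = "Take me home"

            // Donating the activity records that the user performed this action here,
            // so it can appear in the Shortcuts app and be triggered by voice.
            userActivity = activity
            activity.becomeCurrent()
        }
    }

The point made in the rest of this article is that this mechanism only reaches actions that individual apps choose to expose; the truly basic, system-level actions are still out of Siri’s reach.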

All this is fantastic!
There is, however, a big BUT. To date, some basic actions still cannot be managed, because they have not been taken into consideration by the Apple team in charge of Siri’s future enhancements, perhaps because they are considered too basic.

Siri and the “missing words”

As I said before, the engineers at Apple or at other manufacturers are not incompetent or bad people; they are simply not Disabled. This is why large companies that can afford it should invest in a team of Disabled consultants: D-Consulting.

Siri on its own, and then Siri in combination with Shortcuts, is, as far as I have been able to establish, still not able to execute some basic commands. Let’s have a look at some of them:

  1. Siri cannot take us to the Home screen.
    In other words, it is not possible simply to say «Hey Siri, go Home». Today, with Face ID and Siri, I can unlock the iPhone, but in order to reach the Home screen I have to physically swipe the display upwards. This is not logical! It would be enough to introduce a system preference entitled «Where do you want to be once the iPhone has been unlocked?»; a menu could then offer logical destinations such as “Home”.
  2. Siri cannot answer or end a call.
    This is completely absurd. Even though it is an excellent tool, Siri does not allow me to answer a call or subsequently end it.
    This makes no sense!
    It is even more illogical when we consider that the iPhone’s call interface has been managing and centralizing other types of calls for some time now, such as those coming from WhatsApp, just to mention one.

I have given just a couple of simple but important examples, to suggest a solution to Siri’s unjustified limitations.

Siri and touch, the most important missing function

In the last two releases of iOS, especially on the new iPhone models – from the X on – closing all open Apps means swiping up to reach the App Switcher and then dragging each miniaturised App screenshot upwards to close it.
If all open Apps could be closed just by swiping down, the user would have much less to do: with a single movement they could close every open application. This would be an improvement for everyone, not just Disabled users.
Introducing such a multiple-closure command would also make it possible to associate it with a single vocal command using the procedures described in this article. It should be noted that, to date, Siri is not even able to close a single App.

Implementing something like this would be excellent for all people concerned and not just the Disabled.

Siri and my extra words [+]

Disabili DOC – A partial view of Carlo Filippo Follis’ workstation, with the iPhone X in the foreground

The image shows part of my workstation; seeing it will help you better understand what I explain below.

What follows takes into account, among other things, that many Disabled people cannot speak clearly – like me, for example – although I do not have any difficulty making myself understood.

This is why there should be a dedicated area in which to set up actions based on a recognition text (which need not be the title of the command), together with the possibility of training Siri to recognise the pronunciation of that particular text string.
This would make it possible to generate vocal instructions linked to text that is not necessarily meaningful, combining a non-existent word or phrase with a unique vocal command.
In Italian, words like «Telsi» or «Xtel» do not exist; however, a sequence of letters associated with a personalised pronunciation could trigger a fast command.
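
Something close to this already exists at the level of individual apps: an app can present the standard “Add to Siri” screen and let the user record a personal phrase of their choice for one of its actions. The sketch below is purely illustrative (the class name and identifier are invented), and whether Siri’s recogniser would cope with an invented word like «Telsi», or with indistinct speech, is exactly the open question raised here.

    import UIKit
    import Intents
    import IntentsUI

    // Hypothetical screen from which the user records a personal trigger phrase.
    final class VoicePhraseSetupViewController: UIViewController,
                                                INUIAddVoiceShortcutViewControllerDelegate {

        func presentAddToSiri() {
            // Placeholder action; a real app would donate something meaningful here.
            let activity = NSUserActivity(activityType: "com.example.telsi-action")
            activity.title = "Telsi"
            activity.isEligibleForPrediction = true

            // Wrap the action in a shortcut and show the system "Add to Siri" screen,
            // where the user records the phrase in their own voice.
            let addController = INUIAddVoiceShortcutViewController(
                shortcut: INShortcut(userActivity: activity))
            addController.delegate = self
            present(addController, animated: true)
        }

        func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                            didFinishWith voiceShortcut: INVoiceShortcut?,
                                            error: Error?) {
            controller.dismiss(animated: true)
        }

        func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
            controller.dismiss(animated: true)
        }
    }

What cannot be done, as this article points out, is attaching such a phrase to system-level actions such as answering a call; that is precisely the gap described in the scenarios that follow.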
Personally, I keep my iPhone X upright on a wireless charging stand positioned next to my Mac’s keyboard, but because I have difficulty using my hands, this is what happens:

Disabili DOC – The first smartphone pen I used, holding it between my teeth…

The image shows the first smartphone pen I bought, years ago, from Informatica Biella, my trusted Apple store.
With constant use I pierced it with my teeth (ruining them too in the process). So perhaps it would be worth improving something that does not require a lot of effort… what do you think?

  • the iPhone rings;
  • if I am working on my Mac, I answer from my Mac with a click;
  • if my Mac is off, I turn my head, unlock the iPhone via Face ID, pick up the iPhone smartpen with my teeth, and touch the display to accept the call. I then put the pen down (holding it between my teeth would prevent me from talking comprehensibly) and start to talk;
  • provided that the caller, after waiting all this time, has not given up and ended the call.

If, however, Siri were able to recognise my associated intention, e.g. «Telsi», then we would have the following scenario:

  • the iPhone rings;
  • I turn my head to the right, unlock the iPhone with Face ID, and say «Hey Siri, “Telsi”»;
  • I talk.

NOTE: In the second scenario I would not have to go through so much trouble, or break my teeth (and I am not joking) on a metal tube that conducts electricity, since the touchscreen is of the capacitive type.

Let us not forget that a lot of Disabled people have serious speech-related difficulties. A system able to deal with a vocal instruction that is repeated and finally confirmed (as happens when Siri is set up for the first time) is a system capable of interpreting a simple, even guttural, sound and acting upon it.
One may think: «If a person cannot speak properly, why would they need to answer the phone?»
A relative, a member of the family or even a friend might just want to say «I will be home in 10 minutes!»

For the reasons outlined above, and for many others, it is desirable for Siri to be programmed to execute simple instructions that are genuinely useful to all.
If the Shortcuts App is so loved by all users, let’s make it even more useful for the benefit of those who already struggle to manage the simplest actions of everyday life!

The “magic” of AirPods 2

Disabili DOC – Apple AirPods 2


The Apple AirPods 2 earphones allow direct control of the associated device through the functions supported by Siri. It is enough to say «Hey Siri» to receive help from your voice assistant.

AirPods 2 have just been launched on the market. Unlike the previous model, they respond to «Hey Siri»: in other words, it is possible to issue a command to Siri through the earphones and have it transmitted to the associated device.

This innovation has been welcomed by everyone, as it is objectively useful. Since Apple owns all the necessary hardware and software, why not implement the idea proposed in this article?

I am convinced that it would not be a problem for Apple to develop what I am asking for, in the name of all able-bodied and Disabled people.
The problem is conveying this request to Apple so that it can evolve iOS, and perhaps even macOS.

Let’s not forget macOS (OS X)

Siri also operates in macOS, so it is easy to understand how such a development could greatly benefit the quintessential work tool: the computer, the Mac.
The solution I described would be comparable, on the Mac, to a vocal version of TextExpander, all the more so if “Shortcuts” were also brought to macOS.

We have to bear in mind that today the computer is the instrument the vast majority of people with disabilities use, precisely because, to date, it is the hardware to which all the various peripherals on the market can be connected; in many cases these peripherals serve as aids that allow those living with a disability to use the machine directly.

Beyond communicative minimalism

Disabili DOC is not a magazine specialized in technology, computers, smartphones or tablets. I have therefore chosen to offer you the ABC of a more complex idea: I essentially describe only the user-side operation of preset “commands”.

How this additional part of iOS, or of Shortcuts, should be designed to be as fluid as possible – especially when handled by a Disabled person – is a more complex topic, one that should be explained to Apple’s engineers, the only ones able to accomplish this project.

Conclusions

At the base of a technological evolution that can truly be useful to everyone, it is necessary that the different skills of Disabled people guide and complete the vision of the future that is currently being written.
Disabled people must transform their different disabilities into an added value that enriches general knowledge, the knowledge standardized by normality.
After all, not only will disability itself never be defeated, but it could also become part of the lives of those who never imagined such a thing could happen to them.
The attention we give or receive is – in fact – an insurance policy for everyone’s future!


About Author

I was born on 25 February 1963 and at 23 I fulfilled my first entrepreneurial dream: a business that lasted for about twenty years. After a sabbatical period, in 2009 I founded Ideas & Business S.r.l., which began its work as an incubator of projects. In 2013 I decided to realize a publishing dream: to build a network of online magazines. DisabiliDOC.it is the second of these, active since 16 February 2015. Others, already conceived and created, will go public in the coming months. For now I write out of passion, just as I have always worked with passion to reach my goals.
