What's holding ~100% D GUI back?

rikki cattermole rikki at cattermole.co.nz
Wed Nov 27 17:29:13 UTC 2019


On 28/11/2019 6:08 AM, Gregor Mückl wrote:
> On Wednesday, 27 November 2019 at 16:34:33 UTC, rikki cattermole wrote:
>>> Having said that, there's applications that definitely should be 
>>> accessible. There's others where a certain kind of accessibility 
>>> support is just not possible. Screen readers and Photoshop obviously 
>>> make little sense in combination, for example. And accessibility is 
>>> different from automation. There's different APIs for that on 
>>> different platforms.
>>>
>>> So, in general, I'm still not seeing a proper justification for not 
>>> building on top of the work that already exists in dlangUI.
>>
>> UI automation API's is how accessibility programs like screen readers 
>> work.
>>
>> "Microsoft UI Automation is an accessibility framework that enables 
>> Windows applications to provide and consume programmatic information 
>> about user interfaces (UIs). It provides programmatic access to most 
>> UI elements on the desktop. It enables assistive technology products, 
>> such as screen readers, to provide information about the UI to end 
>> users and to manipulate the UI by means other than standard input. UI 
>> Automation also allows automated test scripts to interact with the 
>> UI." - 
>> https://docs.microsoft.com/en-us/windows/win32/winauto/entry-uiauto-win32
> 
> No, this is only how Microsoft chose to name their accessibility 
> interface. Google and Apple use accessibility as the relevant term, 
> including in their actual API naming. The same goes for Gtk and Qt, 
> which are the most common implementations of accessibility features on 
> top of X11 (sadly, the X protocol doesn't have any notion of 
> accessibility itself). Please don't use the single outlier's unique terms.
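
Whatever each vendor calls it, the client-side workflow is the same. Here is a rough sketch in C++ against the UI Automation client API quoted above, finding a control and invoking it the way a screen reader or a test tool would (the "OK" button name is just a placeholder; error handling trimmed):

#include <windows.h>
#include <uiautomation.h>
// Link against ole32.lib and oleaut32.lib.

int main() {
    CoInitializeEx(nullptr, COINIT_MULTITHREADED);

    // The same client interface serves screen readers and automated tests.
    IUIAutomation *automation = nullptr;
    if (FAILED(CoCreateInstance(CLSID_CUIAutomation, nullptr, CLSCTX_INPROC_SERVER,
                                IID_PPV_ARGS(&automation))))
        return 1;

    IUIAutomationElement *root = nullptr;
    automation->GetRootElement(&root);

    // Search the desktop tree for an element named "OK" (placeholder name).
    VARIANT name;
    name.vt = VT_BSTR;
    name.bstrVal = SysAllocString(L"OK");
    IUIAutomationCondition *condition = nullptr;
    automation->CreatePropertyCondition(UIA_NamePropertyId, name, &condition);

    IUIAutomationElement *element = nullptr;
    root->FindFirst(TreeScope_Descendants, condition, &element);

    if (element) {
        // Ask for the Invoke pattern and trigger it: the programmatic
        // equivalent of clicking the control.
        IUIAutomationInvokePattern *invoke = nullptr;
        element->GetCurrentPatternAs(UIA_InvokePatternId, IID_PPV_ARGS(&invoke));
        if (invoke) {
            invoke->Invoke();
            invoke->Release();
        }
        element->Release();
    }

    VariantClear(&name);
    if (condition) condition->Release();
    root->Release();
    automation->Release();
    CoUninitialize();
    return 0;
}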

"Action methods. The NSAccessibility protocol also defines a number of 
methods that simulate button presses, mouse clicks and selections in 
your view or control. By implementing these methods, you give 
accessibility clients the ability to drive your view or control."

1. The user says, “Open Preferences window.”
2. The screen reader sends a message to the app’s accessible element, 
asking for a reference to the menu bar accessible element. It then 
queries the menu bar for a list of its children and queries each child 
for its title until it finds the one whose title matches the app's name 
(that is, the application menu). A second iteration lets it find the 
Preferences menu item within the application menu. Finally, the 
screen reader tells the Preferences menu item to perform the press action.

https://developer.apple.com/library/archive/documentation/Accessibility/Conceptual/AccessibilityMacOSX/OSXAXmodel.html#//apple_ref/doc/uid/TP40001078-CH208-TPXREF101

Sounds like UI automation to me, regardless of what they named it.



And for ATK, which is Gtk's accessibility library: 
https://developer.gnome.org/atk/stable/AtkAction.html

"AtkAction should be implemented by instances of AtkObject classes with 
which the user can interact directly, i.e. buttons, checkboxes, 
scrollbars, e.g. components which are not "passive" providers of UI 
information."

Sounds like UI automation to me, even if it is accessibility-centric.
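
A small sketch of what that looks like in practice (C++ against the GTK 3 / ATK C API, built with `pkg-config --cflags --libs gtk+-3.0`; the helper name is mine): the accessible object behind a widget lists its actions, and an assistive tool can ask it to perform one without any input event.

#include <gtk/gtk.h>
#include <atk/atk.h>

// Hypothetical helper: trigger a widget's default accessible action
// (for a GtkButton that is "click") through ATK, the path an AT would use.
static void invoke_default_action(GtkWidget *widget) {
    AtkObject *accessible = gtk_widget_get_accessible(widget);
    if (!ATK_IS_ACTION(accessible))
        return;  // this object exposes no actions

    AtkAction *action = ATK_ACTION(accessible);
    gint n = atk_action_get_n_actions(action);
    for (gint i = 0; i < n; ++i) {
        const gchar *name = atk_action_get_name(action, i);
        g_print("action %d: %s\n", i, name ? name : "(unnamed)");
    }
    if (n > 0)
        atk_action_do_action(action, 0);  // e.g. "click" on a button
}

static void on_clicked(GtkButton *button, gpointer user_data) {
    g_print("button clicked via its accessible action\n");
}

int main(int argc, char **argv) {
    gtk_init(&argc, &argv);

    GtkWidget *button = gtk_button_new_with_label("Press me");
    g_signal_connect(button, "clicked", G_CALLBACK(on_clicked), NULL);

    GtkWidget *window = gtk_window_new(GTK_WINDOW_TOPLEVEL);
    g_signal_connect(window, "destroy", G_CALLBACK(gtk_main_quit), NULL);
    gtk_container_add(GTK_CONTAINER(window), button);
    gtk_widget_show_all(window);

    invoke_default_action(button);  // "clicks" the button programmatically
    gtk_main();
    return 0;
}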



From Qt, with regard to accessibility support on Windows:

"Also, the new UI Automation support in Qt may become useful for 
application testing, since it can provide metadata and programmatic 
control of UI elements, which can be leveraged by automated test suites 
and other tools."

https://www.qt.io/blog/2018/02/20/qt-5-11-brings-new-accessibility-backend-windows

Solid information on Qt's Linux support is harder to find, but Qt does 
ship accessibility support that interoperates with ATK-based tooling 
through the AT-SPI2 specification.
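
Either way, both bridges sit on the same QAccessibleInterface layer inside Qt. A minimal sketch (Qt 5 widgets, accessibility enabled) of querying a button's accessible interface in-process and invoking its press action, which is exactly the kind of programmatic control those bridges expose to screen readers and test tools:

#include <QApplication>
#include <QPushButton>
#include <QAccessible>
#include <QDebug>

int main(int argc, char **argv) {
    QApplication app(argc, argv);

    QPushButton button("Press me");
    QObject::connect(&button, &QPushButton::clicked,
                     [] { qDebug() << "pressed through the accessible interface"; });
    button.show();

    // The interface Qt's UI Automation and AT-SPI2 bridges are built on can
    // also be queried and driven in-process.
    QAccessibleInterface *iface = QAccessible::queryAccessibleInterface(&button);
    if (iface && iface->actionInterface()) {
        qDebug() << "accessible name:" << iface->text(QAccessible::Name);
        iface->actionInterface()->doAction(QAccessibleActionInterface::pressAction());
    }

    return app.exec();
}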

