windows: ability to output to the currently running screenreader #2

@danielw97

Description

Hi,
First off, great work on this utility, and thanks for making it.
I was first made aware of it via your recent Reddit post on the LocalLLaMA subreddit.
As someone who's been using the CLI to interact with LLMs over the past few years, it's great to have an accessible graphical option that isn't a web UI.
It's still early days of course, but a small wishlist item of mine is the ability to output model responses to a screenreader such as JAWS or NVDA, if possible.
I'm not sure how things are on Mac, but on Windows the main option currently is outputting via the system's default TTS, which is probably SAPI.
I'm not sure how easy something like this would be to implement while still staying cross-platform, though.
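One common approach on Windows is to talk to NVDA through its controller client DLL (`nvdaControllerClient.dll`, shipped with NVDA's controller client SDK), falling back to plain text output elsewhere; libraries like Tolk wrap JAWS/NVDA/SAPI behind one interface in a similar way. As a rough sketch of how the cross-platform dispatch could look (the `SpeechOutput` class name is illustrative, not part of this project; the `nvdaController_*` function names are from NVDA's controller client API):

```python
import ctypes
import platform


class SpeechOutput:
    """Minimal cross-platform speech dispatcher (sketch, not this project's API).

    On Windows it tries to load NVDA's controller client DLL; everywhere
    else, or when no screenreader is reachable, it falls back to plain
    text output so the app keeps working.
    """

    def __init__(self):
        self._nvda = None
        if platform.system() == "Windows":
            try:
                # Requires nvdaControllerClient.dll to be on the DLL search path.
                self._nvda = ctypes.windll.LoadLibrary("nvdaControllerClient.dll")
            except OSError:
                self._nvda = None

    def speak(self, text: str) -> str:
        """Send text to the running screenreader, or fall back to stdout.

        Returns the name of the backend used, which keeps the fallback
        path easy to test.
        """
        # nvdaController_testIfRunning returns 0 when NVDA is running.
        if self._nvda is not None and self._nvda.nvdaController_testIfRunning() == 0:
            self._nvda.nvdaController_speakText(text)
            return "nvda"
        print(text)
        return "fallback"


if __name__ == "__main__":
    out = SpeechOutput()
    out.speak("Model response ready.")
```

A JAWS backend (via its COM API) or a SAPI fallback could slot into the same `speak` dispatch, and on non-Windows platforms the class degrades gracefully instead of crashing.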
