Design Example: Connection Engine
Below is an example I wrote after reflecting on how I wish search engines were better at helping people find mental health support online, even when they don't know they need it. It was a fun exercise in imagining something a for-profit would likely never do for liability reasons, but perhaps ought to do if it cared about the wellbeing of humanity. Because I wrote it quickly (<1 hour), I didn't spend enough time researching related technologies, such as the recent proliferation of automated mental health tools on the market. It could also do a better job discussing the specific groups that might be harmed by this idea, rather than talking about them generically. Because of these sources of improvement, I would knock it down to a 3.5.
Connection engine
Amy J. Ko
One important implication of today's reading on seeking information is that search engines are far from comprehensive in their support. There are so many aspects of information seeking that they don't support -- figuring out what information to seek, why to seek it, how to synthesize all of the information that's found -- mostly, what search engines do is just retrieve things. They have no concept of what we're actually looking for, or how our questions might be misframed. They simply retrieve whatever our queries describe.
Perhaps this is fine. If I say I want funny cats, there's not much ambiguity in that; I'd like some funny cats. But other queries, particularly those that verge on questions one might ask a therapist, aren't nearly as innocuous. Suppose, for example, that I asked, "why do I have no friends?" I have definitely Googled that before, and what the web generally has to offer isn't particularly helpful. Some ad-supported self-improvement websites offer unhelpful advice. Style magazines have columns that tear through a bunch of reasons why you might be an asshole. YouTubers have videos on why it's okay to not have friends. None of these, however, speak to the particular reasons why the person who wrote the query is asking. And that's not surprising: no one on the internet knows the person writing the query, so why would any of the internet's content have an answer specific to their situation?
I propose that we redesign search engines to detect questions that are about a person, and instead of offering search results, offer a different kind of targeted information. Here's how it would work.
First, to detect questions that are about a particular individual in English, we would match simple combinations of interrogative adverbs and first-person pronouns like "I" and "we": "Why am I", "Why do I", "Why does no one like me". This pattern of questioning, where the subject of the question is the person writing the query, is almost certainly something that the internet's content cannot answer, and so the engine should redirect the person asking back to people who can help.
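To make this concrete, here is a rough sketch in Python of the kind of pattern matching I have in mind. The specific patterns and function names are illustrative only; a real classifier would need a far richer, clinically vetted pattern set, and likely a learned model rather than regular expressions.

```python
import re

# Illustrative patterns pairing interrogative adverbs with first-person
# pronouns; a real deployment would need many more patterns, plus handling
# for misspellings and languages other than English.
SELF_QUESTION_PATTERNS = [
    r"^why\s+(am|do|can'?t|don'?t)\s+i\b",
    r"^why\s+does\s+(no\s*one|nobody)\s+(like|love|want)\s+(me|us)\b",
    r"^why\s+(are|do)\s+we\b",
    r"^what('?s|\s+is)\s+wrong\s+with\s+me\b",
]

def is_self_question(query: str) -> bool:
    """Return True if the query looks like a question about the person asking."""
    normalized = query.strip().lower()
    return any(re.search(p, normalized) for p in SELF_QUESTION_PATTERNS)

# Queries like these would be redirected to support resources...
assert is_self_question("Why do I have no friends?")
assert is_self_question("What's wrong with me")
# ...while ordinary retrieval queries would pass through untouched.
assert not is_self_question("funny cats")
```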
This redirection is exceptionally difficult to do without doing harm. But it's not clear that the generic search results provided today are harmless either, as the web is full of generic, poor quality advice. And so here is my best idea for how to handle it with care, while minimizing the risk of harm or self harm.
The first thing the search engine would do is ask if there's someone that the person can talk to, such as a friend, partner, or therapist, and if there is, encourage them to talk to that person instead of consulting the internet. If there is not, then the engine would recommend phone and text crisis lines that specialize in different circumstances, whether self harm, loneliness, or general mental health. These would be lightly filtered against the query if it contains information suggesting that a particular line is relevant; otherwise, all would be offered.
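As a rough sketch of this "light filtering" step, suppose each line were tagged with topic keywords by the curation team. All of the names, numbers, and tags below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class CrisisLine:
    name: str
    contact: str
    topics: set[str]  # keywords suggesting this line is relevant

# Hypothetical entries; a real list would be curated and vetted by
# the mental health specialist team described later.
CRISIS_LINES = [
    CrisisLine("Self-harm crisis line", "(hypothetical number)", {"self harm", "hurt myself"}),
    CrisisLine("Loneliness warmline", "(hypothetical number)", {"lonely", "no friends", "alone"}),
    CrisisLine("General mental health line", "(hypothetical number)", {"anxious", "depressed", "sad"}),
]

def relevant_crisis_lines(query: str) -> list[CrisisLine]:
    """Lightly filter lines by topic keywords in the query; when nothing
    in the query suggests a specific line, offer all of them."""
    q = query.lower()
    matched = [line for line in CRISIS_LINES if any(topic in q for topic in line.topics)]
    return matched or CRISIS_LINES
```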
A third option, if someone does not have someone to talk to, or does not want to use a dedicated support resource, is to recommend a curated set of community supports. This index would include subreddits, private social media groups, or local groups that have submitted their community as a resource available for people whose queries suggest a need for support. These communities would provide information about how to contact the community, a description of the scope of support they can provide, and information about the type of people that are in the community. Because such communities are often ephemeral, these resources would need to be refreshed periodically (e.g., every quarter), ensuring that when people contact them, they are not defunct and have not changed in a significant way.
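A minimal sketch of what an entry in this curated index and its quarterly refresh check might look like, with hypothetical field names:

```python
from dataclasses import dataclass
from datetime import date, timedelta

REFRESH_INTERVAL = timedelta(days=90)  # roughly quarterly

@dataclass
class CommunityResource:
    name: str
    contact_info: str        # how to reach the community
    scope_of_support: str    # what kinds of support it can provide
    member_description: str  # the type of people in the community
    last_verified: date      # when a human last confirmed it is active

    def needs_refresh(self, today: date) -> bool:
        """Flag the resource for re-verification, so that defunct or
        significantly changed communities are not recommended to people
        seeking support."""
        return today - self.last_verified > REFRESH_INTERVAL
```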
As a fallback, the person writing the query could ask for generic search results, but with a warning that they will not necessarily provide advice or help that's particular to their circumstances, and may be misleading. This fallback would not be a normal listing of search results, however: it would maintain a prominent link back to the community resources initially offered, in case the searcher decides that they might be more helpful after all. And the results would prioritize the vetted support lines and communities, rather than whatever pages happen to rank highest under the search engine's normal ranking algorithms.
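The reranking in this fallback could be as simple as moving vetted resources to the top of whatever the normal ranker returns; a minimal sketch, assuming each result carries a hypothetical "vetted" flag set by the curation team:

```python
def rerank(results: list[dict]) -> list[dict]:
    """Place vetted support lines and communities ahead of ordinary results,
    preserving the engine's normal ranking order within each group."""
    # 'vetted' is a hypothetical flag set by the mental health team's
    # curation process; sorted() is stable, so everything else keeps the
    # engine's usual relevance order.
    return sorted(results, key=lambda r: not r.get("vetted", False))
```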
There are several other aspects of this design proposal that need special care. For example, organizationally, this entire search engine interaction flow should be managed and overseen by a team of mental health specialists, so they can find language that is most likely to affirm and support, as well as vet support resource recommendations. This team should have a diverse set of expertise, covering all of the possible types of support that people might seek. And the team should be engaged in regularly auditing search results to ensure that this flow is not doing harm.
There are many risks with this idea. For example, some people use search engines as private spaces (even though in most cases they are not private to the corporation running them). This dialog-based approach to offering search results could raise justified fears about corporate data collection on people with mental health struggles. The implementation of this would need to carefully communicate exactly what happens to their responses in this dialog with the search engine, and to what extent their activities are tracked. (Ideally, nothing would be tracked).
There's also a risk that people would resent the loss of agency in search. The world has come to (falsely) see search engines as neutral tools, and so explicitly reframing them as a mediator of what information people can get, and intentionally redirecting them away from some information, might result in some backlash.
Despite these risks, the current reliance on generic search results for support does not seem sufficient. People deserve to find communities of support, even when they do not explicitly seek them, and search engines now play an important role in helping them find those supports.