On LLM/AI And Its Impact on UI/UX

Ruben Orduz
4 min read · Apr 30, 2023

Image created with Midjourney AI

For almost 50 years, going back to the days of Xerox PARC, engineers and scientists have been working to develop a collection of patterns and a formal field of study for human-computer interaction.

As computers, programs, and devices have all evolved tremendously, so has the need for user experience practitioners to come up with novel concepts and metaphors so that end users have an enjoyable experience.

We, as an industry, have created a mountain of tools, libraries, and frameworks that abstract away complexity and awkwardness to present the user with an easy-to-use, intuitive interface. We have adopted concepts from psychology, human development, and cognitive science to better understand our users and how they use a given device, web app, or even programming interface.

All of those things are rooted in the fact that humans don’t speak binary or hexadecimal and computers don’t speak human language. So first we needed a graphical way to interact with the computer, because command-line interaction is highly unintuitive and requires prior knowledge of the system. Engineers invented the GUI, then the mouse, and so forth. And as computers gained power and capability, tremendously so, new patterns, new behaviors, even new color palettes had to be developed.

Image created with Midjourney AI

For the last 15 years or so, at least for web apps, most of the effort has been concentrated on making CRUD workflows usable, intuitive, and accessible. Much like before, this is because users can’t be expected to be experts at SQL or to know the database schema, field names, and so on just to retrieve the information they need. There is also a need for “single pane of glass” dashboards that unify, at least as far as the user is concerned, disparate data sources in a coherent and cohesive way.

[Record scratch sound]

With recent developments in LLMs and AI, a lot of the underlying need for UIs and their UX is, in my opinion, negated. Projects like ChatGPT, Bard, and Bing (and all of their current third-party offspring) demonstrate that we have reached a point where computers can understand and communicate with humans with ease. As a user, I don’t need three buttons, a dropdown menu, and a search box; I can just type what I’m looking for in natural language, which may translate into an incredibly complex operation behind the scenes, and the system understands the query, parses it internally, makes inferences, and returns what it believes is the most accurate answer. I don’t need to know exact syntax, flags, database names, and so on.

As a user, I can type “create project Overlord, add me as the owner, add three new tasks to it: gather intelligence, discuss with allies, implement” and the system knows what to do. Or “list all the projects I’m part of, sorted by due date, soonest first.” I could go into as much complexity as I wanted, and the system would know what to do.
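To make this concrete, here is a minimal sketch, in Python, of how a prompt-driven front end for a project-management app could work. Everything in it is hypothetical: call_llm is a stub standing in for whatever chat-completion API the app would use, and the action names (create_project, add_task, and so on) are invented for illustration. The point is simply that one text box plus a structured action contract can stand in for buttons, menus, and forms.

```python
import json

# System prompt instructing the model to translate natural language
# into structured actions the application backend can execute.
SYSTEM_PROMPT = (
    "You are the interface for a project-management app. "
    "Translate the user's request into a JSON list of actions. "
    "Allowed actions: create_project, add_member, add_task, list_projects. "
    "Respond with JSON only."
)


def call_llm(system_prompt: str, user_prompt: str) -> str:
    """Stand-in for any chat-completion API.

    Returns a canned response so the sketch runs offline; a real
    implementation would send both prompts to a hosted model.
    """
    return json.dumps([
        {"action": "create_project", "name": "Overlord"},
        {"action": "add_member", "project": "Overlord", "member": "me", "role": "owner"},
        {"action": "add_task", "project": "Overlord", "title": "gather intelligence"},
        {"action": "add_task", "project": "Overlord", "title": "discuss with allies"},
        {"action": "add_task", "project": "Overlord", "title": "implement"},
    ])


def handle_prompt(user_prompt: str) -> None:
    """Parse the model's JSON output and dispatch each action."""
    actions = json.loads(call_llm(SYSTEM_PROMPT, user_prompt))
    for act in actions:
        # A real app would route each action to its backend API;
        # here we just print what would be executed.
        details = {k: v for k, v in act.items() if k != "action"}
        print(f"-> {act['action']}: {details}")


if __name__ == "__main__":
    handle_prompt(
        "create project Overlord, add me as the owner, add three new tasks "
        "to it: gather intelligence, discuss with allies, implement"
    )
```

In a real implementation, the stub would be replaced by an actual model call and each action would hit the app’s backend; the visible UI shrinks to a single prompt field.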

Because they require no special skill or knowledge of the system, prompt-based interfaces obviate the need for much of what has been built and designed over the last two decades in terms of UI and UX. It goes further, though: because prompt-based LLM/AI relies on natural written language, it is the most accessible (in the a11y sense) interface there is. And for those with movement impairments, a voice-to-text tool evens the field. No more tab-order pains, no more contrast woes or aria-* worries. Natural language in, natural language out.

Clearly, not all data and information can be described textually. The need for charts, grids, lists, and the like will remain. But thanks to all the work through the years, this is a “solved problem”: we have widgets, components, and so on that the system can leverage. I posit that in 3–5 years, most web apps, business intelligence and analytics applications included, will look something like a Jupyter notebook (augmented, in the style of Google Colab or Noteable, for example). But it’s too early to tell.

In closing, the latest generation of LLMs and AI brings a lot of promise, but also a lot of potential for disruption and displacement. I do not have a crystal ball, so I can’t see what’s going to happen 3–5 years out, but I believe a lot of disciplines in tech and fields adjacent to it will face a reckoning, including UI/UX.
