Everyone is talking about AI at the moment, especially when it comes to foundation models: the sort of thing where you can feed in some words and get back an image. Stable Diffusion is an example of this, producing artwork based on whatever words you throw at it. We've showcased plenty of Stable Diffusion-generated images right here.

These types of AI models are trained on huge amounts of data, giving them the power to interpret your input and produce something that captures the nuances of what you've said. They have masses of parameters, so this sort of tech usually depends on the cloud for the processing power to run it.

Qualcomm, however, has demonstrated Stable Diffusion running on an Android smartphone, powered by the Snapdragon 8 Gen 2 hardware. That's the platform behind a number of current-gen smartphones, such as the Samsung Galaxy S23 Ultra or the OnePlus 11.

Of course, Qualcomm had to take the Stable Diffusion models and shrink, optimise and compile them, so they run on Qualcomm's AI Engine on the phone rather than in the cloud. You can find a lot more information on how Qualcomm did it in its blog.
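To get a feel for what "shrinking" a model means, here's a toy sketch of post-training INT8 quantization, the kind of step used to make huge models small enough for on-device inference. This is purely illustrative with made-up weight values; it is not Qualcomm's actual toolchain, which handles far more than this.

```python
# Toy illustration of INT8 quantization: mapping 32-bit float weights
# to 8-bit integers plus a single scale factor. Hypothetical values,
# not taken from any real model.

def quantize_int8(weights):
    """Map float weights to int8 range [-127, 127] with one scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.42, -1.3, 0.07, 0.99]   # hypothetical FP32 weights
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)

# Storage drops 4x (8 bits vs 32 bits per weight), at the cost of a
# small rounding error bounded by the scale factor.
max_err = max(abs(a - b) for a, b in zip(weights, approx))
```

Real pipelines quantize billions of weights per layer and then compile the result for a specific accelerator, but the trade-off is the same: less memory and faster arithmetic in exchange for a little precision.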

The result is that you can feed in a request to Stable Diffusion and get a result in under 15 seconds; yes, the output resolution is limited (512 x 512 pixels), but the fact that you're getting a result from the AI running locally rather than in the cloud is significant.

Why is it significant? Because Qualcomm isn’t really demonstrating that you can run Stable Diffusion on a phone. It’s really a demonstration of what an AI model can do without relying on huge processing resources in the cloud. Because when AI models can run on phones, optimised for local hardware, it means you can use them all the time on that device in your pocket.

This is obviously a demo with a hook into Stable Diffusion and the fun images that it produces, but it's not the first time we've seen cloud-based AI tech move to local processing. Google has done something similar with some of its AI models, such as language and translation, meaning you're not dependent on a good data connection to use those sorts of services.

Qualcomm says that future Snapdragon hardware will be even better at these sorts of AI tasks, and points to a future where any Snapdragon-powered device could make use of AI models to enhance the user experience. That might not be a phone; it could be your car, an XR headset or anything else.

The takeaway point is that those fun AI models you'll have witnessed online aren't limited to something you encounter in a browser. As we move into the future, these sorts of AI systems can run offline, and the possibilities are enormous.

Jarvis, you there?