The story behind LightSpace, the AR app that lets you paint faces on appliances, light flowers on fire, and decorate your friends.
On September 28th—an unusually hot day for Brooklyn at this time of year—my hosts asked me to huddle around a video on an iPhone. In it, a girl holding her phone up twirled through a room full of shimmering, suspended snowflakes and cried, “This is my dream! I’m in my dream!”
Aryn and Matt, founders of the app development firm Logical Animal and inventors of this augmented reality drawing app, LightSpace, have created something spectacular. It seems to evoke an explosion of wonder from whomever it touches and leave grown adults giggling.
Naturally, I asked if I might try it out, and in doing so, I got to hear the story of how it came to be.
Nostalgia-driven design
“Do you remember Harold and the Purple Crayon?” was the first thing Aryn asked me when we sat down to talk about the app. I did, and quite vividly. In this classic children’s book, Harold, the protagonist, uses a purple crayon with which he creates his world as he explores it. He draws a moon for light to walk by, trees to climb, and a ship to sail away in.
Harold and his crayon
“I wanted to write and draw and paint the world,” said Aryn. “I wanted to be Harold.” A few months ago, Aryn discussed this with Matt, and found out about his dream to walk through a Victorian space diorama—à la the 1902 film A Trip to the Moon—where stars hung on strings.
As people who make apps, the next step was rather obvious.
Turning up the whimsy
The day after Apple’s ARKit beta launch, Aryn and Matt began building LightSpace. At first, the app simply painted lines in space, but it quickly evolved. “We kept discovering that we had to make the app more whimsical,” said Matt. “People wanted sillier.” Testers would often draw faces on household objects and decorate their friends. “Decorate their friends?” I asked.
“Yeah, there are three things we see over and over. Moustaches, hats, and uh …” said Aryn. “Appendages,” said Matt, cutting her off. They both laughed.
The app needed more magic. On day one, the app drew a solid line in space. It looked like the user was extruding crinkly paint from their phone, which didn’t feel terribly satisfying. System resource limitations meant that the line sometimes had gaps as well. To conceal the gaps, Aryn and Matt added a spectral glow. The results were visually interesting, but then it got even better.
“The big moment for me was when we discovered the flame brush by accident,” said Matt. “The flame brush!” cried Aryn, clapping. While working on the glow, Matt had added a particle emitter that was too coarse and made drawings look like they had been set on fire. “We were like, ‘let’s keep that,’” said Aryn, laughing.
“This entire app really is a series of beautiful mistakes that we capitalized on,” Matt conceded. “At each point there was a technical limitation in the library where the performance of the system couldn’t do what we wanted it to do.” These constraints pushed them to invent brushes like the sparkler and one they call ‘black doom.’ It also gave them an entirely new perspective on depth perception.
Lessons in designing for AR
“Let’s start with depth,” said Aryn. “Depth perception doesn’t work on a flat screen because you no longer have binocular vision.” This makes it hard for users to tell how close they are to a line. This problem is compounded by the fact that phone screens are a slender window from which to view the world, and that users must move that tiny screen to draw. If not perfected, the experience can be befuddling.
There’s also the problem that it’s confusing to have the drawn line emit from the center of the phone screen. This is how they first designed LightSpace, but when users touched the screen to draw, the line (which is actually an inch in diameter) would block their view. They’d have to back away to see what they were doing and lose their place. Matt and Aryn solved this by having the line appear one third of a meter away from the phone, as if at the end of a paintbrush.
“We arrived at that exact distance through testing,” said Aryn. Drawing any closer felt oppressive whereas drawing at a full meter felt like painting with “a log.”
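The geometry behind that paintbrush offset is simple to sketch. Below is an illustrative calculation, not the app's actual code: it assumes an ARKit-style 4×4 camera transform (columns are right, up, +Z, and position, with the camera looking down its local −Z axis) and places the brush tip a fixed distance along the camera's forward direction.

```python
# Illustrative sketch (not LightSpace's real implementation):
# place the brush tip 1/3 m in front of the camera, the distance
# Aryn and Matt arrived at through user testing.

BRUSH_OFFSET_M = 1.0 / 3.0

def brush_tip(camera_transform, offset=BRUSH_OFFSET_M):
    """Return the world-space point `offset` meters in front of the camera.

    `camera_transform` is a 4x4 nested list indexed [row][col].
    Column 2 is the camera's local +Z axis; ARKit cameras look
    down -Z, so forward is that column negated. Column 3 is the
    camera's world position.
    """
    forward = [-camera_transform[i][2] for i in range(3)]
    position = [camera_transform[i][3] for i in range(3)]
    return [p + offset * f for p, f in zip(position, forward)]

# A camera at the world origin with no rotation (identity transform)
# yields a brush tip about 1/3 m in front of it, along world -Z.
identity = [[1.0, 0.0, 0.0, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]
print(brush_tip(identity))
```

Recomputing this point every frame as the camera moves is what makes the phone feel like the handle of a brush rather than the nib itself.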
Users also initially had trouble telling where they were painting because the phone is both the drawing implement and the viewfinder. Try as they might, users would often get the Z dimension wrong and turn what they thought was a 2D painting into what looked like a series of Escher staircases.