Take pictures like a PRO.
The iPhone camera is one of the most widely used digital cameras on the market today. With over 1 billion iPhones in the world, cell phone cameras are replacing traditional pocket cameras: they deliver good image quality, make it easy to take good photos, and are easy to carry. In addition, apps for retouching and sharing photos are very popular.
With the new iPhone 7 camera, Apple reinforced its commitment to photography. The company improved the camera technically and added a dual lens to the Plus model, giving users two focal lengths. This was when we realized that the native camera app didn’t take advantage of all the hardware’s potential. We wanted to improve the user experience and help everyday users approach professional photography. Finally, the AVFoundation framework was updated, giving developers more options when building a camera app and turning Camille’s dream into reality.
Camille was born as a product of Lateral View’s LAB55, to bring users features such as the ones you can find in a DSLR camera.
As an internal LAB55 product, we started with research centered on the possibilities that the updated manual capture API of iOS gave us. Then we shifted our focus and analyzed compact cameras and professional SLR cameras, both digital and analog models. We wanted to know how much of the experience of taking professional photographs we could transfer to our phones. With the help of professional photographers, we made a list of all the features our camera should have.
We built a wireframe prototype and began guerrilla testing, iterating through six versions until we found the camera application we wanted. We understood that we wanted to make a product for amateurs who wanted to learn and belong to the world of professional photography. We found that the camera should offer more options while remaining simple and easy to use, and that users should be able to learn with the app as an intermediate step toward an SLR.
Afterwards, we used the Personas design method to build user archetypes and understand users’ needs. In the app stores we found a great number of retouching, filter, and post-production apps, but no app that could control, improve, and teach users how to improve photos at the moment of the shot.
We wanted to recreate the look and feel of the dials used on analog reflex cameras, with sounds and vibration. The entire application would be controlled with these digital dials. The development and design teams built a prototype to test and validate the experience.
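On iPhone 7, this kind of tactile feedback is available through UIKit’s haptics API. A minimal sketch of the idea, firing a light haptic “click” each time a dial crosses a detent (the class and method names here are illustrative, not Camille’s actual code):

```swift
import UIKit

// Hypothetical helper that mimics the feel of an analog camera dial.
final class DialFeedback {
    // A light impact feels closest to a mechanical detent click.
    private let generator = UIImpactFeedbackGenerator(style: .light)

    // Call when the user touches the dial, so the Taptic Engine
    // is primed and the first click fires with minimal latency.
    func dialWillMove() {
        generator.prepare()
    }

    // Call each time the dial's value crosses a detent (e.g. one ISO stop).
    func dialDidCrossDetent() {
        generator.impactOccurred()
    }
}
```

Pairing each haptic click with a short sound gives the digital dial the physicality of its analog counterpart.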
The tests allowed us to discard overly ambitious features that didn’t deliver the expected results or a good user reception. As the project progressed, Camille’s operation had to be reformulated, but each iteration came closer to the final product.
The most important thing in our application was how a photograph would be taken. We wanted an intuitive interface with all the functionality at hand, while remaining clean and clear. This is why the most important part of Camille is the composition area. The automatic photo (where the user leaves the decision-making to the app) also had to be accessible.
The interface design the user would interact with focused on two very clear objectives: ease of use and evocation. We wanted users to control the entire application while holding the phone with one hand, always able to distinguish which function they were controlling.
We used the Rajdhani font, which evokes the engravings on camera bodies. For the color palette we chose different shades of gray, evoking materials such as aluminum, with red accents. To complement this, a simple line iconography integrates perfectly, achieving a clean and sophisticated interface.
All brand identity was created by Lateral View: the naming, the iso-logotype, and the graphic resources. For the app icon we used a synecdoche, with lines and a color accent that helps users remember the Camille brand.
Apple’s new AVFoundation capture APIs allowed us to build powerful functionality into our app, like setting exposure, white balance, and ISO, changing the point of focus, and even capturing Live Photos, among other features. When we combined this with the new iPhone 7 camera systems, we knew it was time for Camille to empower the photography experience on iOS.
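Manual settings like these go through `AVCaptureDevice`’s configuration lock. A hedged sketch of how a custom ISO and shutter speed might be applied (the helper function is our own illustration, not Camille’s actual code; it assumes `device` is already part of a configured capture session):

```swift
import AVFoundation

// Illustrative helper: lock a custom ISO and shutter speed on a capture device.
func setManualExposure(on device: AVCaptureDevice,
                       iso: Float,
                       durationSeconds: Double) throws {
    // Exclusive access is required before changing capture parameters.
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    // Clamp the requested ISO to what this camera actually supports.
    let format = device.activeFormat
    let clampedISO = max(format.minISO, min(iso, format.maxISO))

    // Express the shutter speed as a CMTime (e.g. 1/60 s -> 0.0167).
    let duration = CMTimeMakeWithSeconds(durationSeconds,
                                         preferredTimescale: 1_000_000)

    // Switch the device into custom exposure mode with our values.
    device.setExposureModeCustom(duration: duration,
                                 iso: clampedISO,
                                 completionHandler: nil)
}
```

Focus and white balance follow the same pattern: lock the device, set a custom value or point of interest, then unlock.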
As we were developing Camille, we realized that the iPhone camera could capture much more data per shot than a regular JPG image file. That’s why we began taking RAW image data from each capture and saving it into a separate file that could be accessed just by plugging the iPhone into any computer. RAW files contain a collection of unprocessed and uncompressed data, which is extremely valuable for photographers.
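Requesting RAW data is done through `AVCapturePhotoOutput`. A minimal sketch, assuming `photoOutput` is attached to a running session and `delegate` handles the resulting photo data (the function itself is illustrative, not Camille’s actual code):

```swift
import AVFoundation

// Illustrative helper: request a RAW capture if the device supports it.
func captureRAW(with photoOutput: AVCapturePhotoOutput,
                delegate: AVCapturePhotoCaptureDelegate) {
    // Not every camera offers RAW; check the available pixel formats first.
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        return // RAW capture not supported on this device
    }

    // Build capture settings around the RAW pixel format.
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)

    // The delegate receives the RAW sample data, which can then be
    // written out as a DNG file for later editing on a computer.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```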
As informative and challenging as it was to experiment with the new AVFoundation APIs, it was also very rewarding to start implementing the XCUIApplication, XCUIElement, and XCUIElementQuery classes in our testing environments.
This simplified the process of adding a test scheme to a project that had started as a LAB55 prototype experiment and now wanted to become a final product.
Combining this technology with unit testing proved very efficient, helping us identify testable core user stories, develop more meaningful tests, and maximize test coverage in a short period of time.
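The three classes above are the building blocks of a UI test. A sketch of what a core-user-story test might look like (the accessibility identifiers here are assumptions for illustration, not Camille’s real ones):

```swift
import XCTest

class CaptureScreenUITests: XCTestCase {
    func testShutterButtonProducesACapture() {
        // XCUIApplication launches a fresh instance of the app under test.
        let app = XCUIApplication()
        app.launch()

        // app.buttons is an XCUIElementQuery; subscripting by identifier
        // resolves it to a single XCUIElement.
        let shutter = app.buttons["shutterButton"] // hypothetical identifier
        XCTAssertTrue(shutter.exists)
        shutter.tap()

        // The core user story: tapping the shutter should produce a capture,
        // surfaced here as a thumbnail of the last photo taken.
        XCTAssertTrue(app.images["lastCaptureThumbnail"].exists) // hypothetical
    }
}
```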
Camille was launched on the App Store as a free app on international Camera Day. Our target audience consisted of frequent Instagram users, so we ran a campaign on that network in Argentina, host country of LAB55. We created tutorials with iPhoneography tips that showcased Camille’s manual functions. On the brand account we reposted pictures taken with the app and shared with the hashtag #CamilleApp, to show that everyone can tell their stories with professional photos. We also contacted local influencers to share their Camille photographs, helping us gain 7k followers in the first two weeks.