Amazing new Google camera app adds shallow depth of field like a full-frame DSLR.
I recently picked up a new iPhone 5S, primarily for the newer camera, and I’m happy to report I’m pretty amazed with the quality of images I’m getting (5S review coming soon). In fact, I used my iPhone more than my Fuji X-Pro1 on a recent trip. It’s faster, it’s easier, the resolution is more than enough for web posting, and it might even be enough for some half-decent printing, *gasp* (more on that later when I do some printing).

As time marches on, the smartphone continues to evolve into a formidable image-making tool. I say “image making” rather than “photo taking” because smartphones have the processing power to fully edit and create images on the fly without ever downloading the photos to a computer. Dedicated cameras can’t do this, so I call them photo-taking devices. The ability to edit photos and create images within the smartphone has been an undeniable advantage over dedicated cameras. Still, smartphones fall short on so many other levels. There is as yet no convenient telephoto experience, which leaves us with a fixed wide-angle focal length that lands somewhere around a 30mm full-frame equivalent. The other huge thing missing is the creamy shallow depth of field we get from large-sensor cameras… oh wait…

Google has just released a new camera app feature called Lens Blur that computes “depth maps” to create realistic bokeh. In Google’s words, it is “a new mode in the Google camera app [that] lets you take a photo with a shallow depth of field using just your Android phone or tablet. Unlike a regular photo, Lens Blur lets you change the point or level of focus after the photo is taken. You can choose to make any object come into focus simply by tapping on it in the image. By changing the depth-of-field slider, you can simulate different aperture sizes, to achieve bokeh effects ranging from subtle to surreal (e.g., tilt-shift). The new image is rendered instantly, allowing you to see your changes in real time.”

Say whhaaaaat?!

This is unbelievable. I think Google is already eating Lytro’s lunch! If you’re interested in all the technical algorithm mumbo-jumbo and how they achieved this truly amazing feature, head over to the Google blog post and read all about it. It’s only a matter of time before this feature hits iOS, and when it does, I’ll have one less reason to take my DSLR out of the bag.
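For the curious, the core idea behind a depth-map blur is simple even if Google’s actual algorithm is not: once every pixel has a depth value, you blur each pixel by an amount proportional to how far its depth is from the plane you tapped to focus on. Here’s a toy Python sketch of that idea (this is my own illustration, not Google’s implementation — it uses a crude box blur instead of a real lens point-spread function):

```python
def lens_blur(image, depth_map, focus_depth, max_radius=5):
    """Fake shallow depth of field from a depth map.

    image, depth_map: 2-D lists of floats with the same shape.
    focus_depth: depth value that should stay sharp (the "tap to focus" plane).
    max_radius: biggest blur radius, i.e. the widest simulated aperture.
    """
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Circle-of-confusion radius grows with distance from the focus plane.
            r = round(max_radius * abs(depth_map[y][x] - focus_depth))
            ys = range(max(0, y - r), min(h, y + r + 1))
            xs = range(max(0, x - r), min(w, x + r + 1))
            vals = [image[yy][xx] for yy in ys for xx in xs]
            out[y][x] = sum(vals) / len(vals)  # box average stands in for lens blur
    return out

# Tiny synthetic scene: vertical stripes; left half near (depth 0), right half far (depth 1).
img = [[1.0 if x % 2 == 0 else 0.0 for x in range(8)] for _ in range(8)]
depth = [[0.0 if x < 4 else 1.0 for x in range(8)] for _ in range(8)]
result = lens_blur(img, depth, focus_depth=0.0)
# Near stripes stay crisp; far stripes smear together.
```

Changing `focus_depth` refocuses the image after the fact, and raising `max_radius` is the “depth-of-field slider” — exactly the two controls the app exposes.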


