
No Filter: The New Science Transforming Your Selfie

The cutting-edge tech changing the future of photography.

Woman in sunglasses taking a selfie on a city street

Can you believe the first camera phone came out 20 years ago? Back then, a built-in 110,000-pixel camera with a 256-colour display was super-futuristic. Who knew there was life beyond grainy images that looked like blotchy computer graphics? Fast forward to 2019 and smartphones like the Galaxy A series snap 12-megapixel photos and show them on vivid colour displays – almost as good as shots taken on a professional camera.

Such advancements mean there will be no such thing as a ‘straight photograph’ in future, says Mark Levoy, Professor Emeritus of Computer Science at Stanford University. ‘Everything will be an amalgam, an interpretation, an enhancement or a variation – either by the photographer as auteur or by the camera itself,’ he says.

In fact, the camera of the future will be more of an app than a device, says former Columbia University lecturer and professional photographer Taylor Davidson, and it will compile data from an array of sensors and combine this with visual data to develop a new type of photograph.

Intrigued? This is the tip of the high-definition iceberg, and the Galaxy S10+ and A series are part of the revolution to push the boundaries of your smartphone selfies.

A silhouette of a man against some fairy lights at night

Bokeh and the black of night

The Galaxy S10+ has an advanced portrait setting that creates bokeh – the aesthetically pleasing blur behind a subject. The camera automatically focuses on the most prominent object in the scene and softly blurs everything outside its focal plane. Unlike a traditional camera, where this effect comes from a wide lens aperture, the S10+ achieves it through computational processing of the image after it has been captured.
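The general idea behind computational bokeh can be sketched in a few lines. This is a minimal illustration, not Samsung's actual pipeline: it assumes a greyscale image plus a per-pixel depth map (which a real phone estimates from its depth sensor or dual cameras), and the function names `box_blur` and `synthetic_bokeh` are invented for this sketch.

```python
import numpy as np

def box_blur(img, k=5):
    """Naive box blur: average each pixel over a k x k neighbourhood."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def synthetic_bokeh(img, depth, focus_depth, tolerance=0.1):
    """Keep pixels near the focal plane sharp; blur the rest.

    img         -- 2-D greyscale image (real pipelines work in colour)
    depth       -- per-pixel depth estimate, scaled to [0, 1]
    focus_depth -- depth of the subject the camera locked on to
    """
    in_focus = np.abs(depth - focus_depth) < tolerance
    blurred = box_blur(img)
    return np.where(in_focus, img, blurred)
```

Production systems refine this with smooth blur falloff and edge-aware masks so hair and glasses don't get clipped, but the core step is the same: a depth estimate decides which pixels stay sharp.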

Similarly, the S10+ has a Dual Aperture mode that lets you take incredibly detailed photos at night, when less light is available. This is supported by machine learning: the phone's processing unit uses artificial intelligence and data to learn how to complete new tasks, rather than following explicit instructions. The camera can switch between two aperture sizes depending on the light, and combines multiple frames into a single, brighter shot.
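Why does combining frames help at night? Each frame of the same scene carries different random sensor noise, so averaging them cancels much of it out. The sketch below demonstrates just that averaging step – it leaves out the frame alignment and tone mapping a real night mode performs, and `merge_frames` is an illustrative name, not a Samsung API.

```python
import numpy as np

def merge_frames(frames):
    """Average a burst of aligned exposures of the same scene.

    Random sensor noise differs from frame to frame, so averaging
    N frames reduces the noise by roughly a factor of sqrt(N).
    """
    return np.stack(frames).mean(axis=0)

# Simulate a dim scene captured as a burst of 8 noisy frames.
rng = np.random.default_rng(42)
scene = np.full((32, 32), 0.1)  # true low-light brightness
frames = [scene + rng.normal(0, 0.05, scene.shape) for _ in range(8)]
merged = merge_frames(frames)
```

Comparing `merged` against any single frame shows noticeably less deviation from the true scene, which is why a burst of short exposures can beat one long, shaky one.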

Three steps of a man taking a selfie to create an Emoji

3D sensors and machine learning

The Galaxy S10+ and A series incorporate 3D sensors and machine learning – it's what allows certain Instagram and Snapchat filters to rebuild your face into a 3D model. In the near future, machine learning may become advanced enough to recreate a whole image in immersive 3D.

Although still hypothetical for now, an exciting possibility is that our smartphones will be able to observe and understand the real world without help from 3D sensors. To get there, devices would need algorithms that use data from old photos as a guide for their continued learning.

A woman doing a cartwheel across 3 phone screens

Rotating cameras and a lens as wide as the eye can see

The Galaxy A80 has a revolutionary camera that can both rotate 180 degrees for the perfect selfie and capture ultra-wide landscape images.

The Galaxy A80’s triple camera setup has a 48-megapixel main camera, an 8-megapixel ultra-wide sensor and a 5-megapixel 3D depth sensor, all of which rotate 180 degrees using a single, cleverly designed motor. That means the same incredible photo quality for selfies and video calls as for outward-facing panoramic shots.

The device also has an ultra-wide lens with a 123-degree field of view – roughly the same as the human eye – and a scene optimiser that recognises and processes up to 30 different types of scene. The result is well-framed pictures in almost any situation.

Woman pointing to another woman within a smartphone

Tilt-shifting to correct a distorted view

Cameras are learning to alter the perspective of an image to improve it, a process called tilt-shifting. In tilt-shift cameras, the lens is tilted relative to the sensor to compensate for where the camera sits in relation to the subject. So if you were sitting down and shooting upwards, the camera would automatically straighten the converging lines in the image rather than recording a distorted view.
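Software can achieve a similar correction after the shot by remapping the image with a perspective transform (a homography). Below is a small sketch of the standard direct-linear-transform method, assuming we know four corner points of a facade that should really be a rectangle; the function names are illustrative, not from any particular camera app.

```python
import numpy as np

def find_homography(src, dst):
    """Solve for the 3x3 perspective transform mapping 4 src points to 4 dst points."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The transform is the null vector of A: the last row of V^T from the SVD.
    _, _, vt = np.linalg.svd(np.array(A, dtype=float))
    return vt[-1].reshape(3, 3)

def warp_point(H, pt):
    """Apply a homography to one (x, y) point via homogeneous coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# A building shot from below: its vertical edges converge, so the
# facade's corners form a trapezoid. Map them back to a true rectangle.
trapezoid = [(10, 0), (90, 0), (100, 100), (0, 100)]
rectangle = [(0, 0), (100, 0), (100, 100), (0, 100)]
H = find_homography(trapezoid, rectangle)
```

Applying `warp_point` with `H` to every pixel coordinate (in practice, the inverse map plus interpolation) straightens the keystoned facade – the same geometric fix a tilted lens performs optically.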

As it stands, the lenses in smartphone cameras are too small to match the larger lenses used by professional photographers but, if this process can be perfected, we may soon see photography on phones equivalent to digital single-lens reflex (DSLR) cameras.

Get a taste of the future today with the Galaxy A80 smartphone

Capture images that stretch as wide as the human eye can see with the Galaxy A80.

Galaxy A80

Galaxy S10+
