(Above: Collie-heaven… beach, ball, and human to chuck it…)

We are lucky to live in an age where we have at our fingertips (phone or tablet) far more computing power than would have seemed possible on a powerful desktop machine a decade ago.

Applications like computer-aided design (CAD) have traditionally demanded more and more power, as what is envisaged in the mind is translated into 3D drawings… and even virtual-reality videos.

Photography is one of the fields in which the full power of the technology is readily available to the non-specialist.

Those who follow my blogs – and thank you – will know that I take a lot of photographs. Recently, I discovered that my library on iCloud held 140,000 images! Most of them I’ll never see again, and it’s a pain to sort through even a fraction, so I’ve started being ruthless with how many I keep.

The problem is the supercomputer in my pocket pretending to be a camera. With a bit of human direction, it’s remarkably good at capturing what is around me. Because it’s my phone, diary, dictation machine, notebook and many other things, I’ve always got it with me.

The power of modern phone cameras raises a few questions, chief of which is whether we still photograph ‘the real’?

There is a big difference between the image-capture stage and its subsequent processing. If I wish, I can set the camera to ‘filter’ what is there as it takes the shot. The downside is that I’ve then lost a lot of what ‘could have been there’ had I post-processed the image later. For this reason, I normally let the camera take the shot ‘as it sees it’, adjusting only the composition; and that usually means the range from ‘telephoto’ to ‘close up’.

Phone cameras are poor at zooming into scenes. It’s asking a lot from those tiny lenses – computer-backed or not! But there is a style of landscape that responds well to the limitations of the phone camera. Consider the example below:

(Above: lovers on the shore?)

It’s a pleasing shot, and not posed. I simply kept my distance and let the camera reach into the lives of the two lovely people enjoying their moment; hopefully without intrusion. I have no idea who they were.

But what about this one:

It’s the same photograph, but processed after the event – and on my iPhone. Here, I’ve deliberately modified the look and feel of the original to tell more of the story – the link between the sun in the sky and the ‘receiving’ humans on the shoreline.

You could say it’s slightly ‘alchemical’ in its symbolism; combining ‘Earth’ – the beach; ‘Air’ – the sky; ‘Water’ – the ocean; and ‘Fire’ – the sun, now sporting four ‘wings’. The imagery is clear: the humans, below, are recipients of one of the best things in life (Love) via the gift of the ‘elements’, led by the Sun.

What is really there and what’s not? It’s impossible to discuss without getting a bit philosophical. We all see the world slightly differently. My eldest son is slightly colour-blind. He can’t see certain greens. Is his reality less? No two people will actually see the same scene, anyway.

The difference is not limited to the perception of colour. We all react to what we see by modifying it with how we feel. We can’t change how our eyes biochemically observe, but we can deliberately choose to see something in a landscape that’s not there – something that might, nevertheless, be there inside our powerful imaginations.

If what I’m trying to capture is the ‘feel’ of the event; the mood, even, then what I want often lends itself to the palette of modern editing tools, many of which are immediately available after taking the original photo. A photographic purist would say that I’m altering things; I’m changing what’s there.

Talk to most photographers, and they will say that the suite of digital editing tools they possess is simply the updated ‘dark room’ of days gone by – that photographers have always had ways to enhance what the camera takes.

(Above: This simple ‘beach and tide’ scene has all it needs to take me back to the mood of the moment when it was taken. I wouldn’t want to enhance it. For me, it’s a perfect memory)

In the beach photo above, I’ve reproduced the shot the camera took. No editing, except cropping the image for my purposes. Everything else is unmodified – and I had no desire to impose my own view as to how it should be seen.

The final shot, above and below, is an example of deliberately going out to find an image that matches a desired state. I wanted to find something natural that would suggest the germination of an idea and its transformation in the mind to a solid and workable reality.

The theme was to convey the ‘birth of an idea’, using the mutation of form from the left-over sea ripples (solid) to water, to the eventual flow back into the ‘great mother’ – the sea.

Walking Tess, our collie, along this Northumberland beach as the sun was setting, I glimpsed the scene above. The well-defined ripples led the eye to the water, whose eventual depth absorbed them. The downward gradient of the beach carried the resulting flow away from the small pool to join the sea.

For me it was a perfect metaphor but, left in its natural state, it might not have conveyed the purpose with enough impact. Experimenting with the depth of colour and object ‘definition’, I was able to create something much more striking.

Real or right? Only the reader can decide in each situation… But the modern photographer now has the tools to be both picture-taker and illustrator; and that can only be a good thing.

The final image, above, is an attempt to create a piece of art from a photograph. The original photo has been dramatically altered to create a ‘dreamy effect’. No ‘real’ photo would have these colours in it, but I wanted a ‘fantasy’ scene.

©Stephen Tanham 2021

Stephen Tanham is a Director of the Silent Eye, a journey through the forest of personality to the dawn of Being.

http://www.thesilenteye.co.uk and http://www.suningemini.blog

18 Comments on “Real or Right?”

    • Thank you, Jaye. It’s a real problem; and one that there is probably a race to solve by the big AI companies. Google are likely to be front-runners, as it’s just the kind of ‘killer’ app that we all need. We need a ‘whole environment’ that absorbs all our photos, then refines the selection of what we want to get rid of before it deletes anything. It would first show us photos of people and give us the choice of keeping everyone who looks like person x. When we delete a set of shots of, say, sunsets, it would ask if we wanted a stream of fast-delete (return key) actions to take out whole sets of them, but let us, in real time, mark those flowing past that we wanted to keep. Anything we’d already marked as a Favourite would be kept. The AI would also recognise sequences of specific holidays, asking us if we wanted to keep certain blocks of dates on that basis. Google are very good at selecting and deleting blocks of old Gmail, but care is needed. When the process was finished, the AI would show you random images from the vast set it proposed to delete. It would keep that set for, say, a month, feeding you random image sets during each day so you had a final chance to save them from mass deletion. Just some thoughts – much of the above is available in other forms at present. It just needs the big tech people to bring it together – and it’s a race, because we’ll all want it, tomorrow! 😎


  1. I love your photographs, Steve. I am still playing with mine and haven’t tried anything other than ‘normal’ pictures yet. Your second picture looks like an oil painting of the first. Lovely.


  2. Hi Steve,

    This is something I’ve thought about a lot. Certain traditional photographers have always hated the idea of filters and things like High Dynamic Range photography because they distort reality. I do admire the skill of the professional photographer, but we all see things differently, and who is to say whose vision of reality is the more valid? As you point out, a colour blind person sees things in ways others don’t because of a slight variation in the programming of their senses, but their reality is no less real than anyone else’s. And imagination alone can, and I’m convinced does, dramatically alter the way we view a scene by engaging the emotions.

    I find my cameras never capture things as I saw them, but I can tweak them with all these marvellous tools and filters we have in the digital darkroom nowadays to bring the image back to the way we felt it. Or sometimes, as you show here, the camera sees things in a multitude of ways, and it takes experimentation to reveal the most wonderful insights from what at first might seem the most mundane of images.

    Using cameras, it’s easier to avoid the cloud storage problem. I use a couple of removable drives, for backing up, which forces me to be selective. I wonder how much energy it takes to keep all the world’s forgotten images alive.

    Great pictures as always.

