
We are lucky to live in an age where we have at our fingertips (phone or tablet) far more computing power than would have seemed possible on a powerful desktop machine a decade ago.
Applications like computer-aided design (CAD) have traditionally demanded more and more power, as the ability to envisage what is in the mind is translated into 3D drawings… and even virtual reality videos.
Photography is one of the fields in which the full power of the technology is readily available to the non-specialist.
Those who follow my blogs – and thank you – will know that I take a lot of photographs. Recently, I discovered that my library on iCloud held 140,000 images! Most of them I’ll never see again, and it’s a pain to sort through even a fraction, so I’ve started being ruthless with how many I keep.
The problem is the supercomputer in my pocket pretending to be a camera. With a bit of human direction, it’s remarkably good at capturing what is around me. Because it’s my phone, diary, dictation machine, notebook and many other things, I’ve always got it with me.
The power of modern phone cameras raises a few questions, chief of which is whether we still photograph ‘the real’.
There is a big difference between the image-capture stage and its subsequent processing. If I wish, I can set the camera to ‘filter’ what is there as it takes the shot. The downside is that I’ve then lost a lot of what ‘could have been there’ had I post-processed the image later. For this reason, I normally let the camera take the shot ‘as it sees it’, adjusting only the composition; and that usually means the range from ‘telephoto’ to ‘close-up’.
Phone cameras are poor at zooming into scenes. It’s asking a lot from those tiny lenses – computer-backed or not! But there is a style of landscape that responds well to the limitations of the phone camera. Consider the example below:

It’s a pleasing shot, and not posed. I simply kept my distance and let the camera reach into the lives of the two lovely people enjoying their moment; hopefully without intrusion. I have no idea who they were.
But what about this one:

It’s the same photograph, but processed after the event – and on my iPhone. Here, I’ve deliberately modified the look and feel of the original to tell more of the story – the link between the sun in the sky and the ‘receiving’ humans on the shoreline.
You could say it’s slightly ‘alchemical’ in its symbolism, combining ‘Earth’ – the beach; ‘Air’ – the sky; ‘Water’ – the ocean; and ‘Fire’ – the sun, now sporting four ‘wings’. The imagery is clear: the humans, below, are recipients of one of the best things in life (Love) via the gift of the ‘elements’, led by the Sun.
What is really there and what’s not? It’s impossible to discuss without getting a bit philosophical. We all see the world slightly differently. My eldest son is slightly colour-blind. He can’t see certain greens. Is his reality less? No two people will actually see the same scene, anyway.
The difference is not limited to the perception of colour. We all react to what we see by modifying it with how we feel. We can’t change how our eyes biochemically observe, but we can deliberately choose to see something in a landscape that isn’t physically there – though it may well be there inside our powerful imaginations.
If what I’m trying to capture is the ‘feel’ of the event; the mood, even, then what I want often lends itself to the palette of modern editing tools, many of which are immediately available after taking the original photo. A photographic purist would say that I’m altering things; I’m changing what’s there.
Talk to most photographers, and they will say that the suite of digital editing tools they possess is simply the updated ‘dark room’ of days gone by – that photographers have always had ways to enhance what the camera takes.

In the beach photo above, I’ve reproduced the shot the camera took. No editing, except cropping the image for my purposes. Everything else is unmodified – and I had no desire to impose my own view as to how it should be seen.

The final shot, above and below, is an example of deliberately going out to find an image that matches a desired state. I wanted to find something natural that would suggest the germination of an idea and its transformation in the mind to a solid and workable reality.

Walking Tess, our collie, along this Northumberland beach as the sun was setting, I glimpsed the scene above. The well-defined ripples led the eye to the water, whose eventual depth absorbed them. The downward gradient of the beach carried the outflow from the small pool until it joined the sea.
For me it was a perfect metaphor but, left in its natural state, it might not have conveyed the purpose with enough impact. Experimenting with the depth of colour and object ‘definition’, I was able to create something far more striking.
Real or right? Only the reader can decide in each situation… But the modern photographer now has the tools to be both picture-taker and illustrator; and that can only be a good thing.

The final image, above, is an attempt to create a piece of art from a photograph. The original photo has been dramatically altered to create a ‘dreamy effect’. No ‘real’ photo would have these colours in it, but I wanted a ‘fantasy’ scene.
© Stephen Tanham 2021
Stephen Tanham is a Director of the Silent Eye, a journey through the forest of personality to the dawn of Being.
http://www.thesilenteye.co.uk and http://www.suningemini.blog
This is why most of us have so many photographs… How on earth can we choose which ones to delete?
Thank you, Jaye. It’s a real problem, and one the big AI companies are probably racing to solve. Google are likely to be front-runners, as it’s just the kind of ‘killer’ app that we all need. We need a ‘whole environment’ that absorbs all our photos, then refines the selection of what we want to get rid of before it deletes anything.

It would first show us photos of people and give us the choice of keeping everyone who looks like person x. When we delete a set of shots of, say, sunsets, it would ask if we wanted a stream of fast-delete (return key) actions to take out whole sets of them, but let us mark, in real time, those flowing past that we wanted to keep. Anything we’ve already marked as a Favourite would be kept. The AI would also recognise sequences from specific holidays, asking us if we wanted to keep certain blocks of dates on that basis. Google are very good at selecting and deleting blocks of old Gmail, but care is needed.

When the process was finished, the AI would show you random images from the vast set it proposed to delete. It would keep that set for, say, a month, feeding you random image sets each day so you had a final chance to save anything from the mass delete. Just some thoughts – much of the above is available in other forms at present. It just needs the big tech people to bring it together – and it’s a race, because we’ll all want it, tomorrow! 😎
What do you think about those USBs designed for images, Steve?
Are these new devices, Jaye? I may have missed something if they are!
Apparently this particular USB will search and collect all your images. Sounds like a great idea as mine are all over the place, but it doesn’t say if finding them on the USB is easy too…
Thanks Jaye. I wonder how it would cope with cloud storage – which is where most of our pictures live, these days?
Good question, Steve. I don’t use cloud storage as much as I should…
It’s so easy, Jaye. Just like having another drive.
There seem to be so many, and some are complicated (for me, anyway). Which is your choice, Steve?
I’m wall-to-wall Apple, Jaye, so I use iCloud. Microsoft have their own, too.
I’m on the fence, trying to decide which is better, Apple or Microsoft…
I would go with the make of your computer. Usually, the compatibility is better. Apple have a better track record of personal security, though…
😍
I love your photographs, Steve. I am still playing with mine and haven’t tried anything other than ‘normal’ pictures yet. Your second picture looks like an oil painting of the first. Lovely.
Thank you, Di. The app ‘Snapseed’ by Google is free. It’s also brilliant and can be used on your laptop or desktop. I use it all the time now.
Thanks
Hi Steve,
This is something I’ve thought about a lot. Certain traditional photographers have always hated the idea of filters and things like High Dynamic Range photography because they distort reality. I do admire the skill of the professional photographer, but we all see things differently, and who is to say whose vision of reality is the more valid? As you point out, a colour-blind person sees things in ways others don’t because of a slight variation in the programming of their senses, but their reality is no less real than anyone else’s. And imagination alone can, and I’m convinced does, dramatically alter the way we view a scene by engaging the emotions.
I find my cameras never capture things as I saw them, but I can tweak them with all these marvellous tools and filters we have in the digital darkroom nowadays to bring the image back to the way we felt it. Or sometimes, as you show here, the camera sees things in a multitude of ways, and it takes experimentation to reveal the most wonderful insights from what at first might seem the most mundane of images.
Using cameras, it’s easier to avoid the cloud-storage problem. I use a couple of removable drives for backing up, which forces me to be selective. I wonder how much energy it takes to keep all the world’s forgotten images alive.
Great pictures as always.
Thank you, Michael. A shared passion, I think. And a similar set of evolved views. It’s really a photographer’s paradise, these days. We are so lucky!