<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>ai — Savva Pistolas</title>
  <subtitle>Writing about AI, alignment, systems thinking, cybersecurity, futurism, privacy, and more.</subtitle>
  <link href="https://pistolas.co.uk/feeds/tags/ai/feed.xml" rel="self" type="application/atom+xml"/>
  <link href="https://pistolas.co.uk/tag/ai/" rel="alternate" type="text/html"/>
  <id>https://pistolas.co.uk/tag/ai/</id>
  
  
  <updated>2026-04-05T10:09:26Z</updated>
  
  <author>
    <name>Savva Pistolas</name>
    <email>savva@pistolas.co.uk</email>
  </author>
  
  <entry>
    <title>AI as a detector of work that needn&#39;t be</title>
    <link href="https://pistolas.co.uk/work-that-need-not-be/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/work-that-need-not-be/</id>
    <published>2026-03-08T00:00:00Z</published>
    <updated>2026-03-08T00:00:00Z</updated>
    <summary>Can AI serve as our quiet advocate for rooting out poorly designed systems that sideline human experience in favour of performative artefacts that allude to productivity?</summary>
<content type="html"><![CDATA[&lt;p&gt;Artificial Intelligence is the provision of an omni-capable tool that can be deployed seemingly anywhere in your life to produce instant, accurate, competent satisfaction of any requirement. Whether it’s anxiety-quelling email drafts to reply to a complex ‘multi-stakeholder’ situation at work, or full-scale automation of your entire University degree - from labs, to reporting, to reflection. AI fulfils requirements without fatigue, and without need for much affective input on your part. You can produce artefacts that fit the shape of ‘output’ for near-any system in work or study. Often this is labelled a productivity enhancer - enabling us to spin additional plates and optimise to the moon and back. As with the fundamental dialectical tradition, where ‘progress’ and creation are inextricably linked with decay and destruction, let us reflect on what the contrary of our new era of AI productivity might be.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;In our days, everything seems pregnant with its contrary: Machinery, gifted with the wonderful power of shortening and fructifying human labour, we behold starving and overworking it; The newfangled sources of wealth, by some strange weird spell, are turned into sources of want; The victories of art seem bought by the loss of character.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;Large language models accomplish any task that can be ‘reduced’ to pattern discovery and subsequent exploration, so they absolutely smash coding, maths, chess, DNA, law, etc. Generally speaking, our species does well to produce machines that automate procedures, and AI is the most sophisticated iteration of this goal so far. Where AI succeeds at producing an effective artefact, the observer would do well to ask whether such an artefact was ever appropriate for a human being to produce in the first place. This piece explores whether AI can serve as our quiet advocate for rooting out poorly designed systems that sideline human experience and outcomes in favour of performative artefacts that allude to ‘productivity’ without any meaningful impact on our world.&lt;/p&gt;
&lt;h2&gt;Academia&lt;/h2&gt;
&lt;p&gt;Let us first look at academia in the UK up to at least MSc level for our assessment of pro- or anti-human design; what was once an earnest commitment to the development of knowledge derived from intrinsic motivation has become a hyper-marketised, cynical, and un-provenanced set of institutions that operate near-solely for profit. International students mill in and out of the country on restrictive visas, paying exorbitant fees to attend poorly planned and atomistic courses that are fulfilled by teaching assistants and professors who barely have the time to populate their material with the love and attention that good teaching needs. The focus is purely on an ‘output’ of degrees that can be leveraged in less economically developed nations, hung entirely on the walking-dead reputation of institutions that no longer have the capacity to separate their knowledge production from their profit production.&lt;/p&gt;
&lt;p&gt;In such institutions the discretion, discussion, and initiative that come with true learning are inconveniences to be innovated away. What is really desirable and effective is for students to perform learning, and for faculty to perform teaching. Enter AI; the perfect companion to the ‘performance of academia’.&lt;/p&gt;
&lt;p&gt;The faculty who work in good faith - the ‘good hearts in sick bodies’ - are working hard to deal with the inundation of submissions that have been augmented or entirely produced by AI tooling. They wonder if there’s any way ‘back’ to a world where students are authentically engaged with material. The elephant in the staff room is that AI is just the whistleblower for the underlying and devastating reality that complex academic institutions removed authentic markers for student development from their feedback loops long ago. We’ve just now reached a stage where tech is available to ensure everyone can present as ‘up-to-speed’ instead of dropping out - which used to be conveniently leveraged to produce an appearance of quality and excellence, as evidenced through completion rates and diverse student outcomes. Now of course, everyone gets a 2:1.&lt;/p&gt;
&lt;p&gt;Using this particular lens, AI is not a hurdle for higher education to jump, but an &lt;em&gt;assessment&lt;/em&gt; for it to improve in response to. Work that can be done without authentic, interpersonal, and embodied engagement with students is unlikely to be pro-human design in the first place! Systems that measure skill and competence without any relational or intersubjective artefacts at all are guaranteed to be atomised, alienating, and ultimately ineffective. The fact that robots can ace the courses from start to finish is the smoking gun.&lt;/p&gt;
&lt;p&gt;No such automation is available for mentor-mentee (&lt;em&gt;or master-apprentice&lt;/em&gt;) arrangements, where development is sewn into a lasting relationship that is reflected in work-objects that are all at once an opportunity, an assessment, and a reward; an embodied artefact of development and refinement over time. AI screams at us that we must urgently reform higher education (&lt;em&gt;starting at assessment processes and working backwards!&lt;/em&gt;) to identify relational consensus from collaborative groups, fuelled by intrinsic motivation as the desirable output of university - and that this output is the precious input for knowledge-production that makes the world a better, safer, civil place to be.&lt;/p&gt;
&lt;h2&gt;Writing hard or hardly writing?&lt;/h2&gt;
&lt;p&gt;Take another example - writing. AI can easily produce a prosaic estimation of any particular subject matter, and inflate it to fit a style of your choosing. These words are the well-dressed zombies of the human corpus of text-gone-by: conjured to walk and dutifully attend to our inboxes, but without any soul! AI prose is for applications of the written word that require nothing but the utilitarian conveyance of data from box to box.&lt;/p&gt;
&lt;p&gt;This is why it’s perfectly normal to use AI to write your emails, but utterly absurd to use it to write your reflective journal. Writing for outcomes is easily replaced by the bot, but writing for reflection, insight, and knowledge-production is not. So for most folks working day-to-day, the use of AI is just the technical actualisation of the scratch at the back of our brain when we write our press releases, our marketing copy, or our emoji-laden internal weekly-wins roundup newsletter for the team. We do of course know that, by and large, the words we write at work are not a viable contribution to any great or meaningful human project - but instead the dutiful population of the working day with a performance of productivity; our AI tools once again attend as exhibit A in the trial that asks whether the system we’re producing these artefacts for is a humane one, or a machine that obviates human benefit for a performance of productivity.&lt;/p&gt;
&lt;h2&gt;Ask not what you can do with AI, but what AI can stop us doing at all.&lt;/h2&gt;
&lt;p&gt;Taken in aggregate, how much of our collective time do we waste on the production of artefacts that serve no purpose but to allude to the effectiveness of complex systems that don’t authentically serve any great human interest? How often do we reflect on our work in school or business, and realise that we are pretending to try while others pretend to listen?&lt;/p&gt;
&lt;p&gt;When we ask what the place of this AI tooling in academia, work, and life is, we must make sure we do so circling the right systems as our scope of assessment, and with the right outcomes in mind! It ought never be a reflection about whether or not we need to be using AI for these varied applications, but whether we should be undertaking such ventures as human beings in the first place! And no, this is not an AI booster blog that suggests we’re mere moments away from ‘automating’ these tasks and flying to the moon in our open-claw productivity spaceships… Instead, it’s an earnest suggestion that AI (&lt;em&gt;among its many fantastic uses&lt;/em&gt;) can be used as an effective mechanism for assessing the prevalence of anti-human design in complex systems.&lt;/p&gt;
&lt;p&gt;The system is what the system does. The machine that counts beans also functions to tell us we are destined for greater things than just counting beans, provided we are wise enough to see how easily beans can be counted by machines, kind enough to share the bean counter, and brave enough to decide if we want to count the beans at all.&lt;/p&gt;
]]></content>
  </entry>
  
  <entry>
    <title>Wage slaves: the Neo way</title>
    <link href="https://pistolas.co.uk/neo-robot/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/neo-robot/</id>
    <published>2025-11-02T00:00:00Z</published>
    <updated>2025-11-02T00:00:00Z</updated>
    
<content type="html"><![CDATA[&lt;p&gt;The new Neo robot premiered this week, showcasing the $500-a-month subscription that sees customers gain access to a general-purpose home robot. The thrust of the proposition is that the hardware is attached to a smart AI that can carry out autonomous problem-solving tasks such as cleaning up the house, loading the dishwasher, and folding clothes. You have access to an app that allows you to schedule certain tasks, and the bot even has a ‘companion’ mode so that isolated but monied old folks have access to an embodied computation process to differentiate between cayenne and paprika.&lt;/p&gt;
&lt;p&gt;&lt;a href=&quot;https://youtu.be/LTYMWadOW7c?si=bRKx8d2-KVxdrqSs&quot;&gt;Watch the video&lt;/a&gt; for some context on the presentation of the thing - I think there’s something so interesting about the approach taken for this. It’s quite a nostalgia-driven advert that positions the main speaker as explaining the robot to his grandma - effectively placing the robot in the established present as something to be caught up with, rather than a new proposition to be thought over or weighed up. It was an impressive pitch.&lt;/p&gt;
&lt;h2&gt;New hardware for an old enemy&lt;/h2&gt;
&lt;p&gt;The issue of course is that this is yet another massive smash-and-grab attempt on digital privacy in the global north, coupled with exploitation of the global south. First and foremost, all the popular media circulating of the bot shows the robot moving around home spaces inoffensively, inhuman in affect, but human in effect. The robot is, however, being teleoperated by someone using a virtual reality headset and controllers in almost all of the demonstrative media. While certain tasks (that are traditionally very easy for a human being to complete, but hard for a robot) can be given over to a problem-solving AI, a lot of the activity that requires critical thinking and planning - such as cleaning an entire house, or identifying and then taking out the trash - is actually intended to be handed over to a remote operator in India.&lt;/p&gt;
&lt;h2&gt;The clear and obvious definition of a fixed rate for unlimited human labour.&lt;/h2&gt;
&lt;p&gt;This means you have access to unlimited human labour for $500 a month. This economy is deliverable only by providing a physical actuator for human labour in another country. Of course it would be illegal to ship over these workers and pay them their local domestic salaries in the global north to complete unlimited labour in the home day and night. Here then is one of the brazen value propositions of the Neo bot: We will make your domestic workers as fungible in form as you treat them to be in your mind’s eye. Interchangeable, dismissible, and inhuman. Abstraction of relational and eye-to-eye accountability is a key part of economies of scale in complex systems reliant on human labour, and this is no different.&lt;/p&gt;
&lt;p&gt;In addition to presenting a complex challenge to labour law, this open approach is an evolution on the history of AI-centred organisations; historically, we have many examples of modern ‘mechanical Turks’ - that is, dressing up human labour as an AI application (Like Amazon’s ‘just walk out’ store which was actually staffed remotely by hundreds of Indian workers). Neo does away with the cloak and dagger, the system is open and indeed proud of its value proposition to the customer. Having produced a mechanism where instead of utilising the sustained efforts of one person, you instead tap into a managed pool of labour that can access the same physical actuation in your living room - Neo intends to sell wage slavery by the back door, abstracting your enjoyment of domestic bliss from the fleet of workers who provide it to you.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Generally&lt;/strong&gt;, when people become wealthy enough that they own a space they cannot possibly maintain by themselves, they employ domestic workers. This is a complicated affair because domestic workers are people with souls and material needs, and likely exist in material conditions very different to those of the person wealthy enough to pay someone to tend to their home. This means that the wealthy are inherently suspicious and distrusting of their domestic workers; they think they will steal from them or live in resentment of their employer. The wealthy have to witness the worker, and worse than this - they have to witness the worker witness them. Neo is a sanitising solution; finally, we have access to the effects of domestic workers, without the human inputs or considerations.&lt;/p&gt;
&lt;p&gt;Best of all for the rich and suspicious: the sense organs afforded to the domestic worker to carry out their daily labour are a surveillance mechanism of their effectiveness and honesty. No more discretion, no more relationship, no more trust.&lt;/p&gt;
&lt;h2&gt;Data nightmare on legs&lt;/h2&gt;
&lt;p&gt;The box-tee baseball-cap surfer-dude tech-bro who runs the wage-labourer laundering company identifies a gender-essentialising data model he calls ‘big sister’, which is where you admit that you’re a ‘big brother’ company but promise that you’re using the data to do good. It seems like the company has made it to a working teleoperational robot that can do a few things well, and aims to capture data from its first commercial user base to train its model over time.&lt;/p&gt;
&lt;p&gt;In addition to training off the manual labourers who perform domestic work via teleoperation (and building an incredibly valuable model most likely), this also means that your Neo bot is a data ingestion point, sucking up thousands of images of your home each day. The mind boggles at a future where law enforcement can get a floor plan of your house, or be let in at 2am by issuing a warrant to your domestic robot.&lt;/p&gt;
&lt;h2&gt;Conclusion&lt;/h2&gt;
&lt;p&gt;AI’s central thrust is the trading of obscene amounts of compute, with clear and irreversible impacts on the climate - disproportionately affecting the global south. We have a rich and long-standing tradition of exploiting the economic positioning of the people who live in poorer countries, which was leveraged as society digitalised to conduct labour at a distance - Indian call centres became a meme out of this tendency.&lt;/p&gt;
&lt;p&gt;The last 10 years of work in the AI space have built a vehicle for ramping up efforts to sell consumer conveniences in the global north that are thinly veiled and mechanically abstracted outsourcings of thinking and doing to people in other countries - largely the global south. All of this is incarnate and refreshed in Neo: the robot that proves that the Venn diagram of people who like to brush wage slavery under the rug and people who think that doing the dishes is below them is a circle.&lt;/p&gt;
]]></content>
  </entry>
  
  <entry>
    <title>The new apple advert wants you to stop thinking about other people</title>
    <link href="https://pistolas.co.uk/apple-ad/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/apple-ad/</id>
    <published>2025-03-04T00:00:00Z</published>
    <updated>2025-03-04T00:00:00Z</updated>
    <summary>September’s ad campaign saw a formal invitation to truly stop thinking about the people we love.</summary>
    <content type="html"><![CDATA[&lt;h3&gt;This article is six months old&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;I wrote this in September 2024, and simply never published it. It refers to an Apple ad campaign from September 2024. The focus still holds up as we continue to be served AI-driven features, so I decided to share it.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h1&gt;Introduction&lt;/h1&gt;
&lt;p&gt;The shiny new Apple adverts work with Bella Ramsey to promote the ‘just-in-time’ wonder features made possible by ‘Apple Intelligence’, whereby the iPhone can utilise access to Ramsey’s data (her calendar, her email inbox, and her photo library) to avoid the realisation of an awkward or imperfect social moment. The best thing to do would be to go and watch the three examples before reading this, so here’s a link with each summary:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Bella is at a party and sees someone across the room who she met at a meeting a few weeks ago, but forgets his name; luckily, Bella can ask Siri who she “went to that meeting with a couple of weeks ago” at a certain cafe. Siri reminds Bella who that meeting was scheduled with, so she can greet Zach by name as he sidles over.&lt;/li&gt;
&lt;li&gt;Bella is lunching with an agent who asks what she thought of the pitch she emailed over. Bella hasn’t read it and checks her phone, using the new “Summarise with AI” feature to read off a summary of the email and improvise a reaction. The agent reacts positively to this.&lt;/li&gt;
&lt;li&gt;Bella is outside with her family; her mother, father, and younger sister (Kristy) stand surrounding the kid’s fresh grave for her pet fish. The father struggles awkwardly to improvise a eulogy. Luckily, Bella can ask her AI-assisted photo album to produce a custom photo album to music - using the prompt “Kristy with her fish, sad vibes”.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Besides the fact that Bella is rudely checking her phone in these interactions (or ducking behind a wall to avoid the gaze of Zach) - somewhat undermining the authenticity the ad is trying to sell us, the ad shines a light on the soft power campaigns I think we’ll see more of as AI continues to try and brand itself as a consumer solution, not a data nightmare.&lt;/p&gt;
&lt;h1&gt;The Apple Way&lt;/h1&gt;
&lt;p&gt;The Apple ecosystem is infamous for producing consumer electronics that work together with a continuity and convenience that improves in quality the more of their devices you add; similarly, Apple devices don’t play well with outsider devices, and it can be quite frustrating to use Android or Windows devices once you have become accustomed to “The Apple Way” of doing things. This design language teaches users that Apple sorts out the technicalities of computation, and that you get to experience the benefits of technology without any of the mechanics. This new advert takes these benefits into interpersonal interactions, and should be held in the same light as we examine where Apple are trying to convince us the line sits between the benefits of the social world and the cumbersome mechanics we must endure only until they can be automated.&lt;/p&gt;
&lt;p&gt;All three adverts centre on the idea that there was a failing of some sort that has led to this moment, whether it’s the normal experience of forgetting someone’s name, or the absenteeism of modern tech-bro fathers everywhere in not paying attention to their children. Apple make clear that these moments are undesirable and ought to be done away with if at all possible. The premise is of course disagreeable; it is normal to find yourself forgetting someone’s name, and it is equally normal to be unprepared for a meeting. It is (sadly) normal for a parent to forget the things that their child finds most interesting and engaging. With the exception of the latter example, it is well understood that you just muddle through these moments as best you can, confronting the mild and impermanent anxiety that comes with this. You come out the other end a little sheepish, but otherwise unharmed. If you find those moments truly difficult, you pursue some behavioural or communicative improvement or strategy.&lt;/p&gt;
&lt;p&gt;Of course the advert shows the opposite of this: that the need - or indeed the opportunity - for reflection is nerve-wracking, and is about to thankfully be made irrelevant. This AI feature is an augmentation of what we already behaviourally use smartphones for: quelling anxiety. Dead space and time are filled with the scrolling of social media, you are never left alone or unoccupied, the thoughts or feelings of where you are and how you regulate that can be numbed immediately. There is no longer any need to be unstimulated. These social faux pas were a holdout against this flattened and flattening state of affairs - in the real world you can be pulled back into yourself and forced to confront your own understanding of reality when someone is brushing up against it in a way that isn’t immediately compartmentalisable. The ad is communicating quite clearly that you can stop these impure moments of a real and proper life from occurring if you take a bite of the apple, and they promise to make those twinging, cringing moments melt away. In the case of the family - we are told that we can simply outsource these difficult moments to Apple (What are the priorities of this family, that the nurturing of Kristy in a moment of sadness and learning ought to be outsourced to a consumer electronic? What a dismal and undesirable way of life; at the beginning of the advert Kristy had an inattentive father, by the end of it we had a demonstration that she had an inattentive family).&lt;/p&gt;
&lt;p&gt;What then is the price of these features? Apple need to be able to make a sufficiently detailed digital twin of you that they can use to feed actionable information back to you. This demands data. Data for the “you-machine”.&lt;/p&gt;
&lt;h1&gt;Here’s the deal&lt;/h1&gt;
&lt;p&gt;Apple will facilitate this data-driven avoidant omnipotence if you ensure that you use an Apple calendar, an Apple mailbox, an Apple phone, and Apple storage. If you buy in totally, then Apple can do the thinking and processing for you. This is very similar to the aforementioned design language of Apple, save for one key difference: the scope. As mentioned, traditionally Apple was focused on building an ecosystem of connected devices and services that don’t meaningfully interoperate with outsiders - shunning or disincentivising devices outside of their private ecosystem. This new AI approach ‘innovates’ on this and asks you to buy in totally to an Apple-facilitated ‘lifestyle system’, shunning non-Apple means of planning, chatting, photographing, calendaring, and beyond. If you decide to meet a friend next Tuesday, you ought to pop it into your iCalendar using a descriptive title (one that includes your friend’s name to tie them to the event), and don’t forget to include the name of the place you’re meeting. This voluntary reporting gives a copy of your plans to your iPhone so that it can use it to answer future queries and questions you may have.&lt;/p&gt;
&lt;p&gt;This is the opt-in that gives the phone sufficient data to produce and maintain a digital twin of you - one that contains live access to your plans, events, geo-tagged photos, notes, messages, etc. It is this that is probed in those moments to provide an unerring account of everything that you’ve ever done, everyone that you’ve ever seen, every message ever read. Where possible, these applications and services must all be Apple’s, the data must belong to them.&lt;/p&gt;
&lt;p&gt;You’ll be able to make better decisions about birthday presents for friends if you ensure the device has access to your entire conversation history, so you better make sure it’s on iMessage and not on Signal or WhatsApp. And don’t forget to sign in to your emails with Mail on macOS to ensure you never need to read another email properly again. Apple’s native journalling app can summarise your mood last week far more concisely than if you needed to leaf through your physical diary, so perhaps just commit to using that.&lt;/p&gt;
&lt;p&gt;This is the trade: Give them everything about you, so you don’t have to feel anxious about being yourself anymore.&lt;/p&gt;
&lt;h1&gt;What future is this advertising?&lt;/h1&gt;
&lt;p&gt;I was reminded of an awesome article written by Sam Kriss back in April - where he reflects on a month spent without his phone. He observed what came back to him when he stopped relying on his phone so much - the different shapes of the nerves and the thoughts that bubble up when you don’t have a constant reality escape hatch in the form of a connected device, and how this felt fruitful and whole for him:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;A phone is a device for &lt;em&gt;muting the anxieties proper to being alive&lt;/em&gt;. This is what all its functions and features ultimately achieve: cameras deliver you from time, GPS abstracts you out of space, and an all-consuming screen that keeps you a constant safe distance from yourself. If there’s something you’re worried or upset about, you can simply hide behind your phone and it will all go away. One third of adults say they’re on their phones almost constantly. Their entire waking lives are spent &lt;em&gt;filling time&lt;/em&gt;, plastering over the gaps, burning up one day after another, waiting for something to happen, and it never does.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The full piece is well worth the read. This AI-facilitated socialising is an extension of this &lt;em&gt;muting of the anxieties proper to being alive&lt;/em&gt; - Apple Intelligence will deliver us from interrelation with others as and when we see fit, expanding that escape hatch to include our immediate interactions. I wonder about a future where this technology is completely metabolised into common use, and what this means for us. Is it going to become rude or taboo when those of us who don’t adopt the technology continue to make human errors? Are 16-year-olds going to do ‘networking prep’ on their mobile phones before going to parties, making sure that they’ve got suitable talking points and social contexts set straight with their device before rocking up to a house party? Explored fully across peer groups, the problem scales quite quickly.&lt;/p&gt;
&lt;p&gt;Building on this, the system invites you to turn your friends into data-subjects; Zach doesn’t know that his whereabouts are being processed by some random device from inferred metadata, nor, arguably, does young Kristy even have the ability to offer informed consent to allow her likeness to be processed and collated by an AI - because she is a child. We’re being invited to literally capture more of our friends and relatives, to build a machine that ensures we know them and ourselves less and less. AI providers continue to cast these ethical questions by the wayside in an attempt to throw us irreversibly into a post-privacy world.&lt;/p&gt;
&lt;h1&gt;Zooming out&lt;/h1&gt;
&lt;p&gt;There’s a lot of current media focus on the ecological and social impact that AI is having on the material world around us - ranging from the overuse of purified water to keep data centres cool, to the AI-enabled production and distribution of synthetic child sexual abuse material. AI also needs a lot of data to chew on to work effectively, and this data comes from our organic and semi-voluntary use of platforms that don’t give us a functional means to opt-out. This is combined with the flurry of boosterism from AI magnates such as Sam Altman who suggest that all we need to do to solve these problems is offer up more computational power, energy, and data until the AI itself proffers a solution. It’s a brazen strategy that asks us to step deeper into the flames to find the water - solve the data problem by giving it more data, solve the climate crisis by burning more of our fuel.&lt;/p&gt;
&lt;p&gt;Apple’s new ad is an early example of what we’re bound to see more of; we’ll be offered consumer conveniences at cost to our data sovereignty, privacy, and authenticity of self. Apple are asking if we can be bought off while the VC-funded sprint to end the world the fastest carries on unchecked and unregulated.&lt;/p&gt;
]]></content>
  </entry>
  
  <entry>
    <title>Anxiety alleviation rituals are not knowledge production</title>
    <link href="https://pistolas.co.uk/how-to-know/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/how-to-know/</id>
    <published>2024-11-04T00:00:00Z</published>
    <updated>2024-11-04T00:00:00Z</updated>
    <summary>Reflections on the use of the term research and big data&#39;s attempts to innovate away the integrity of critical thinking</summary>
    <content type="html"><![CDATA[&lt;p&gt;Short reflection after reading: &lt;a href=&quot;https://www.theguardian.com/lifeandstyle/2024/oct/27/for-my-son-ive-ceased-to-be-the-font-of-all-useful-knowledge&quot;&gt;For my son, I’ve ceased to be the font of all useful knowledge&lt;/a&gt; from the Grauniad.&lt;/p&gt;
&lt;h1&gt;Do some research&lt;/h1&gt;
&lt;p&gt;The term ‘research’ is routinely misused to refer to ‘the collection of information to make or inform a decision or action’. When I am asked to put together a business case for a new tool at work, or to decide where we ought to go for dinner next Tuesday, I suggest that “I do some research on the subject”. Of course this is not what research is; research is the systematised work that aims to contribute to &lt;strong&gt;the&lt;/strong&gt; stock of human knowledge.&lt;/p&gt;
&lt;p&gt;When we say “&lt;strong&gt;The&lt;/strong&gt; stock of human knowledge”, we certainly don’t mean the bits of it you or I have experiential access to; we mean holistically and totally, totting up the knowledge that anybody and everybody has access to. If somebody knows about it already, then you are not doing research to find that information; you are just retrieving that information and making it known to yourself - further to the ‘research’ that produced it as knowledge.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://pistolas.co.uk/assets/images/OyxEqDR.webp&quot; alt=&quot;A flow chart showing the research object as a process of knowledge production, and the accessing of knowledge as information&quot; /&gt;&lt;/p&gt;
&lt;p&gt;Research is the process for a system of knowledge production, not the mechanism by which we make knowledge that exists available to ourselves. The use of the term research interchangeably with the action of information retrieval is symptomatic of a society that has a mostly individualised and individualising relationship with their information systems, where you equate what is known and knowable with what you (as an individual) know and can know.&lt;/p&gt;
&lt;h1&gt;Digitalising how to know, providing what to know&lt;/h1&gt;
&lt;p&gt;The constant mix-up over the term ‘research’ is an insight into the lack of clear demarcation between &lt;strong&gt;research&lt;/strong&gt; and &lt;strong&gt;information retrieval&lt;/strong&gt; in what we could consider the ‘common sense’. In truth, the processes of research and the scientific method are our socio-cultural machine for ‘How to know’ something, with the resultant information produced being ‘What to know’.&lt;/p&gt;
&lt;p&gt;Consumer electronics such as our kitchen-listener friend Alexa invite us to hand ownership of that mechanism of ‘How to know’ something over to Amazon Web Services; “doing your own research” is now the act of submitting an information request to one of any number of monopolists who aggregate and present data (&lt;em&gt;selectively, and in the order that best serves their advertising partners&lt;/em&gt;), with that process presented as the effective mechanism for ‘How to know’ something.&lt;/p&gt;
&lt;p&gt;Further still, the AI ‘revolution’ changes the landscape for users, with AI-powered summaries now ingesting multiple sources to produce an approximate digest of results (Google’s generative AI search results and NotebookLM being relevant examples). Whatever remnants there may have been of the behaviour of investigating and fully exploring various sites or perspectives (which is still not a true research method, but is designed to approximate one) are disincentivised - with your review of resources now automated, you get the information you need and a list of “sources” to the right. ‘How to know’ is obsolete; all praise ‘What to know’.&lt;/p&gt;
&lt;p&gt;&lt;img src=&quot;https://pistolas.co.uk/assets/images/IDUTypP.webp&quot; alt=&quot;A google result for &amp;quot;What is critical thinking&amp;quot;.&quot; /&gt;&lt;/p&gt;
&lt;p&gt;The true mechanisms of critical engagement and integrity of due process melt away into an anxious state of being for any participant in this new way of being and knowing, where knowing is not a habit of reflection or commitment to a process, but instead the perceived ability to dip into the resource bank at any time and access the offered information. Equally, to not know is now simply to be unable to check your understanding against the relevant tool - a phone, a smart speaker, a search engine. Behaviourally, this leaves us with access to what we want to know, but no sense of authorship over the process that produced our understanding. Our relationship to this kind of knowledge is an anxious one, in which we defer from our own faculties for learning and understanding, and foster a faith-based relationship to information.&lt;/p&gt;
&lt;p&gt;These are the digital information systems that people will use by default if subjected to broken-by-design devices that work to commodify knowledge and its access as data. It is profitable for providers to produce a relationship to knowledge that is owned and tended to by the devices they market as the true mechanism for how to know something.&lt;/p&gt;
&lt;p&gt;The jettisoning of the need for an explanatory, critical relationship with information is underway. It is systematised as normal and efficient to prioritise finding out “what you need to know”, with any process deemed obsolete and time-consuming. Is it surprising that, in this new world, a curious six-year-old may decide it’s easier to skip the back-and-forth questioning with Dad and simply get what he “needs to know” from the apparently superior source and digital childminder - &lt;a href=&quot;http://amazon.com/&quot;&gt;Amazon.com&lt;/a&gt; Inc?&lt;/p&gt;
]]></content>
  </entry>
  
</feed>
