<?xml version="1.0" encoding="UTF-8"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>privacy — Savva Pistolas</title>
  <subtitle>Writing about AI, alignment, systems thinking, cybersecurity, futurism, privacy, and more.</subtitle>
  <link href="https://pistolas.co.uk/feeds/tags/privacy/feed.xml" rel="self" type="application/atom+xml"/>
  <link href="https://pistolas.co.uk/tag/privacy/" rel="alternate" type="text/html"/>
  <id>https://pistolas.co.uk/tag/privacy/</id>
  
  
  <updated>2026-04-05T10:09:26Z</updated>
  
  <author>
    <name>Savva Pistolas</name>
    <email>savva@pistolas.co.uk</email>
  </author>
  
  <entry>
    <title>Agency claims from subjects in closed systems - Legalposting on social media</title>
    <link href="https://pistolas.co.uk/agency-claims-in-closed-systems/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/agency-claims-in-closed-systems/</id>
    <published>2025-09-29T00:00:00Z</published>
    <updated>2025-09-29T00:00:00Z</updated>
    <summary>What cringeworthy legalese Facebook posts tell us about our relationship to exploitative data monopolies</summary>
    <content type="html"><![CDATA[&lt;h1&gt;Context&lt;/h1&gt;
&lt;p&gt;I wrote this about two years ago, and have been cleaning up my Obsidian vault in anticipation of a new academic venture that will require a lot of note-taking… I came across this old self-blog and thought I’d host it here.&lt;/p&gt;
&lt;h1&gt;Stimulus&lt;/h1&gt;
&lt;p&gt;&lt;s&gt;I was recently delivering a security awareness training session for a client&lt;/s&gt; &lt;em&gt;I was delivering a Security Awareness Training for a client in 2023&lt;/em&gt;, and was hit with a question that took me back to about 2015: &amp;quot;Sometimes you see those Facebook posts that declare that you don’t give them permission to use your data and things like that - does that hold any water?&amp;quot; It blasted me back to when I would sit at my laptop as a fifteen-year-old Facebook user and watch these armchair solicitors - in very good faith - copy and paste viral statements regarding the latest exploitation of your information by Facebook or Twitter, attempting to opt out of it with a legalese-ridden post.&lt;/p&gt;
&lt;img src=&quot;https://pistolas.co.uk/assets/images/dLyEQ4N.webp&quot; width=&quot;300&quot; /&gt;
&lt;p&gt;Other examples I remembered were simpler acts of attempting to withdraw consent to data capture:&lt;/p&gt;
&lt;img src=&quot;https://pistolas.co.uk/assets/images/Hgraea4.webp&quot; width=&quot;300&quot; /&gt;
&lt;p&gt;My answer to this question at the training was that there is no granularity to your consent when you use digital platforms that provide social media services: if you have signed the EULA (which will have been a requirement for accessing the platform), then the platform owner and their team are the only ones who have a say in the exploitation of the information you provide them.&lt;/p&gt;
&lt;p&gt;Of course there are settings that can be configured to alter your privacy preferences, but these almost entirely relate to how your data can be used or viewed by third parties - the owner-operator of the platform has free rein with your data for the most part. There are currently some options to supposedly opt out of the use of your data in training the AI models of certain platforms. When looking into this, I found that there was a new example of the legal-post phenomenon relating specifically to revoking access to your data for AI training:&lt;/p&gt;
&lt;img src=&quot;https://pistolas.co.uk/assets/images/xyiKpCC.webp&quot; width=&quot;300&quot; /&gt;
&lt;h1&gt;Reflection&lt;/h1&gt;
&lt;p&gt;Users who feel there is any chance of directly modifying the data relationship a platform has with them by posting something are declaring something quite important: they believe they can use the bounded system provided by the platform to modify or escape the system itself. This signals a belief in the agency the user thinks they have to express themselves on the platform, and a belief that this digitally enabled speech is equivalent to a public announcement or legal declaration. Paradoxically, it rather does suggest that the user ought not to have their data exploited by the platform - because they cannot have made an informed decision to use the platform in the first place if they expected such a post to have any sort of impact.&lt;/p&gt;
&lt;p&gt;It also exposes that, for the most part, we (as in people) still don’t understand the nuts and bolts of social media as a matter of common sense, and tend to treat it like a public commons or political sphere. Facebook is intuited by many as a digitisation of your persona for use with your professional and personal peer group. It is used to plan events, buy and sell, and make groups that mirror or imitate real-world counterparts of such processes. These processes, in the real world, are defined by people’s participation, not by the platform of facilitation. We assume too readily that their digital and artificial counterparts provide the same core ‘features’ or ‘freedoms’. They do not. They are designed as data services, and have no ‘social mandate’ beyond the fact that they’re being used.&lt;/p&gt;
&lt;p&gt;The perception among social media users that they are acting as their authentic ‘digital selves’ on these platforms, rather than accessing a locked-down and for-profit business platform, shows how much educational work is needed to counter the ongoing social-media cultural campaign convincing users that they are “expressing themselves authentically” on the platform.&lt;/p&gt;
&lt;p&gt;It’s tempting to quietly categorise the type of user who would post a “just to be safe” pseudo-legal notice as an older user, less familiar with technology and its mechanics - but of course this would be inaccurate. Younger users who have grown up in a post-explanatory consumer-electronics landscape use social media as a primary communication platform in their peer groups. Instagram and Snapchat are generally held in mind as ‘identity communication’ toolkits. While campaigns to inform users of the impact of social media tend to focus on the social impacts, it may also be worth starting to educate users on how harmful it is to spend the majority of your social time on for-profit, data-driven platforms that seek to produce a service user who understands their political or personal identity entirely as a set of declarations, hosted (sponsored/mandated/permitted) by a tech company, with trips to the TikTok shop to make purchases acting as proof-of-identity.&lt;/p&gt;
&lt;p&gt;Average users are conditioned from the outset to build or produce a digital identity that relies on declaration - declarations of hobbies, interests, and ‘hot takes’. It may well be this hyper-focus on the individuation of politicking in digital spaces that puts such an emphasis on identity politics over materialism amongst onliners when they get into the real world. The whole shtick of social media is that you have been given permission to identify and express yourself on the platform, and need only declare who you are (the louder and more frequently the better) in order to successfully ‘be yourself’. The platform becomes the observer and mediator of identity claims. Is it any wonder that this class of digital-first service users gets a nasty shock when they discover that the gilded cage they’ve found themselves in can’t be opened with the very same positivist identity claims the platform tells them they’re made of?&lt;/p&gt;
]]></content>
  </entry>
  
  <entry>
    <title>The new Apple advert wants you to stop thinking about other people</title>
    <link href="https://pistolas.co.uk/apple-ad/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/apple-ad/</id>
    <published>2025-03-04T00:00:00Z</published>
    <updated>2025-03-04T00:00:00Z</updated>
    <summary>September’s ad campaign: a formal invitation to truly stop thinking about the people we love.</summary>
    <content type="html"><![CDATA[&lt;h3&gt;This article is six months old&lt;/h3&gt;
&lt;blockquote&gt;
&lt;p&gt;I wrote this in September 2024, and simply never published it. It refers to an Apple ad campaign from September 2024. The focus still holds up as we continue to be served AI-driven features, so I decided to share it.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h1&gt;Introduction&lt;/h1&gt;
&lt;p&gt;The shiny new Apple adverts feature Bella Ramsey promoting the ‘just-in-time’ wonder features made possible by ‘Apple Intelligence’, whereby the iPhone can use its access to Ramsey’s data (her calendar, her email inbox, and her photo library) to avoid the realisation of an awkward or imperfect social moment. The best thing to do would be to go and watch the three examples before reading this, so here’s a summary of each:&lt;/p&gt;
&lt;ol&gt;
&lt;li&gt;Bella is at a party and sees someone across the room who she met at a meeting a few weeks ago, but forgets his name; luckily, Bella can ask Siri who she “went to that meeting with a couple of weeks ago” at a certain cafe. Siri reminds Bella who that meeting was scheduled with, and she can greet Zach by name as he sidles over.&lt;/li&gt;
&lt;li&gt;Bella is lunching with an agent who asks what she thought of the pitch she emailed over. Bella hasn’t read it and checks her phone, using the new “Summarise with AI” feature to read off a summary of the email and improvise a reaction. The agent reacts positively to this.&lt;/li&gt;
&lt;li&gt;Bella is outside with her family; her mother, father, and younger sister (Kristy) stand around the fresh grave of the kid’s pet fish. The father struggles awkwardly to improvise a eulogy. Luckily, Bella can ask her AI-assisted photo album to produce a custom slideshow set to music - using the prompt “Kristy with her fish, sad vibes”.&lt;/li&gt;
&lt;/ol&gt;
&lt;p&gt;Besides the fact that Bella is rudely checking her phone in these interactions (or ducking behind a wall to avoid Zach’s gaze) - somewhat undermining the authenticity the ad is trying to sell us - the campaign shines a light on the soft-power campaigns I think we’ll see more of as AI continues to try to brand itself as a consumer solution, not a data nightmare.&lt;/p&gt;
&lt;h1&gt;The Apple Way&lt;/h1&gt;
&lt;p&gt;The Apple ecosystem is infamous for producing consumer electronics that work together with a continuity and convenience that improves the more of their devices you add; conversely, Apple devices don’t play well with outsiders, and it can be quite frustrating to use Android or Windows devices once you have become accustomed to “The Apple Way” of doing things. This design language teaches users that Apple sorts out the technicalities of computation, and that you get to experience the benefits of technology without any of the mechanics. This new advert carries those benefits into interpersonal interactions, and should be examined in the same light: where are Apple trying to convince us the line sits between the benefits of the social world and the cumbersome mechanics we must endure only until they can be automated?&lt;/p&gt;
&lt;p&gt;All three adverts centre on the idea that a failing of some sort has led to this moment, whether it’s the normal experience of forgetting someone’s name, or the absenteeism of modern tech-bro fathers everywhere in not paying attention to their children. Apple make clear that these moments are undesirable and ought to be done away with if at all possible. The premise is of course disagreeable; it is normal to find yourself forgetting someone’s name, and it is equally normal to be unprepared for a meeting. It is (sadly) normal for a parent to forget the things their child finds most interesting and engaging. With the exception of the latter example, it is well understood that you just muddle through these moments as best you can, confronting the mild and impermanent anxiety that comes with them. You come out the other end a little sheepish, but otherwise unharmed. If you find those moments truly difficult, you pursue some behavioural or communicative improvement or strategy.&lt;/p&gt;
&lt;p&gt;Of course the advert shows the opposite of this: that the need - or indeed the opportunity - for reflection is nerve-wracking, and is about to thankfully be made irrelevant. This AI feature is an augmentation of what we already behaviourally use smartphones for: quelling anxiety. Dead space and time are filled with the scrolling of social media, you are never left alone or unoccupied, and the thoughts or feelings of where you are and how you regulate that can be numbed immediately. There is no longer any need to be unstimulated. These social faux pas were a holdout against this flattened and flattening state of affairs - in the real world you can be pulled back into yourself and forced to confront your own understanding of reality when someone is brushing up against it in a way that isn’t immediately compartmentalisable. The ad communicates quite clearly that you can prevent these impure moments of a real and proper life from occurring if you take a bite of the apple, and they promise to make those twinging, cringing moments melt away. In the case of the family, we are told that we can simply outsource these difficult moments to Apple. (What are the priorities of this family, that the nurturing of Kristy in a moment of sadness and learning ought to be outsourced to a consumer electronic? What a dismal and undesirable way of life; at the beginning of the advert Kristy had an inattentive father, and by the end of it we had a demonstration that she had an inattentive family.)&lt;/p&gt;
&lt;p&gt;What then is the price of these features? Apple need to be able to make a sufficiently detailed digital twin of you that they can use it to feed actionable information back to you. This demands data. Data for the “you-machine”.&lt;/p&gt;
&lt;h1&gt;Here’s the deal&lt;/h1&gt;
&lt;p&gt;Apple will facilitate this data-driven avoidant omnipotence if you ensure that you use an Apple calendar, an Apple mailbox, an Apple phone, and Apple storage. If you buy in totally, then Apple can do the thinking and processing for you. This is very similar to the aforementioned design language of Apple, save for one key difference: the scope. As mentioned, traditionally Apple focused on building an ecosystem of connected devices and services that don’t meaningfully interoperate with outsiders - shunning or disincentivising devices outside of their private ecosystem. This new AI approach ‘innovates’ on this and asks you to buy in totally to an Apple-facilitated ‘lifestyle system’, shunning non-Apple means of planning, chatting, photographing, calendaring, and beyond. If you decide to meet a friend next Tuesday, you ought to pop it into your iCalendar using a descriptive title (one that includes your friend’s name to tie them to the event), and don’t forget to include the name of the place you’re meeting. This voluntary reporting gives a copy of your plans to your iPhone so that it can use them to answer future queries and questions you may have.&lt;/p&gt;
&lt;p&gt;This is the opt-in that gives the phone sufficient data to produce and maintain a digital twin of you - one that contains live access to your plans, events, geo-tagged photos, notes, messages, etc. It is this that is probed in those moments to provide an unerring account of everything that you’ve ever done, everyone that you’ve ever seen, every message ever read. Where possible, these applications and services must all be Apple’s, the data must belong to them.&lt;/p&gt;
&lt;p&gt;You’ll be able to make better decisions about birthday presents for friends if you ensure the device has access to your entire conversation history, so you had better make sure it’s on iMessage and not on Signal or WhatsApp. And don’t forget to sign in to your emails with Mail on macOS to ensure you never need to read another email properly again. Apple’s native journalling app can summarise your mood last week far more concisely than if you needed to leaf through your physical diary, so perhaps just commit to using that.&lt;/p&gt;
&lt;p&gt;This is the trade: Give them everything about you, so you don’t have to feel anxious about being yourself anymore.&lt;/p&gt;
&lt;h1&gt;What future is this advertising?&lt;/h1&gt;
&lt;p&gt;I was reminded of an awesome article written by Sam Kriss back in April, where he reflects on a month spent without his phone. He observed what came back to him when he stopped relying on his phone so much - the different shapes of the nerves and the thoughts that bubble up when you don’t have a constant reality escape hatch in the form of a connected device, and how this felt fruitful and whole for him:&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;A phone is a device for &lt;em&gt;muting the anxieties proper to being alive&lt;/em&gt;. This is what all its functions and features ultimately achieve: cameras deliver you from time, GPS abstracts you out of space, and an all-consuming screen that keeps you a constant safe distance from yourself. If there’s something you’re worried or upset about, you can simply hide behind your phone and it will all go away. One third of adults say they’re on their phones almost constantly. Their entire waking lives are spent &lt;em&gt;filling time&lt;/em&gt;, plastering over the gaps, burning up one day after another, waiting for something to happen, and it never does.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;The full piece is well worth the read. This AI-facilitated socialising is an extension of this &lt;em&gt;muting of the anxieties proper to being alive&lt;/em&gt; - Apple Intelligence will deliver us from interrelation with others as and when we see fit, expanding that escape hatch to include our immediate interactions with other people. I wonder about a future where this technology is completely metabolised into common use, and what this means for us. Is it going to become rude or taboo when those of us who don’t adopt the technology continue to make human errors? Are 16-year-olds going to do ‘networking prep’ on their phones before going to parties, making sure they’ve got suitable talking points and social contexts set straight with their device before rocking up to a house party? Explore fully the impact this will have on peer groups and the problem scales quite quickly.&lt;/p&gt;
&lt;p&gt;Building on this, the system invites you to turn your friends into data-subjects; Zach doesn’t know that his whereabouts are being processed by some random device from inferred metadata, nor does young Kristy arguably even have the ability to offer informed consent to her likeness being processed and collated by an AI - because she is a child. We’re being invited to literally capture more of our friends and relatives, to build a machine that ensures we know them and ourselves less and less. AI providers continue to cast these ethical questions aside in an attempt to throw us irreversibly into a post-privacy world.&lt;/p&gt;
&lt;h1&gt;Zooming out&lt;/h1&gt;
&lt;p&gt;There’s a lot of current media focus on the ecological and social impact that AI is having on the material world around us - ranging from the overuse of purified water to keep data centres cool, to the AI-enabled production and distribution of synthetic child sexual abuse material. AI also needs a lot of data to chew on to work effectively, and this data comes from our organic and semi-voluntary use of platforms that don’t give us a functional means to opt-out. This is combined with the flurry of boosterism from AI magnates such as Sam Altman who suggest that all we need to do to solve these problems is offer up more computational power, energy, and data until the AI itself proffers a solution. It’s a brazen strategy that asks us to step deeper into the flames to find the water - solve the data problem by giving it more data, solve the climate crisis by burning more of our fuel.&lt;/p&gt;
&lt;p&gt;Apple’s new ad is an early example of what we’re bound to see more of: we’ll be offered consumer conveniences at a cost to our data sovereignty, privacy, and authenticity of self. Apple are asking if we can be bought off while the VC-funded sprint to end the world the fastest carries on unchecked and unregulated.&lt;/p&gt;
]]></content>
  </entry>
  
  <entry>
    <title>Pointing at the mushrooms - Identifying our own digital colonisation</title>
    <link href="https://pistolas.co.uk/mushroom/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/mushroom/</id>
    <published>2025-03-04T00:00:00Z</published>
    <updated>2025-03-04T00:00:00Z</updated>
    <summary>A chat about digital advertising and fungus</summary>
    <content type="html"><![CDATA[&lt;p&gt;Far and away the most popular misconception I have heard about the way digital advertising works is that our phones listen to us and create tailored advertising based on this. It is a very accessible point of conversation for people who maybe don’t know the workings of the mechanisms that make their technology work but can use pattern recognition to identify their interests being presented back to them on their devices.&lt;/p&gt;
&lt;p&gt;The classic story is always that someone was having a conversation with a friend (one that was incredibly specific and unrelated to what they would normally talk about) and then, shortly after, received adverts tailored around the subject of conversation so specifically that the only logical conclusion is that the phone was listening to their conversation.&lt;/p&gt;
&lt;p&gt;While social media is objectively consuming your usage data to produce tailored advertisements - up to and including your message content, what you’re viewing, how you scroll, and who you’re talking to - it is not recording your voice in the background.&lt;/p&gt;
&lt;p&gt;Aside from being an incredibly resource-intensive and impractical way to gather data that would be mostly pocket-noise, any recordings taken would be an observable action taken by software on your phone. Applications like Facebook and Instagram are constantly being researched and probed by security-conscious researchers and hobbyists who are working to identify new ways the social media giants are recording data on us. General audio recordings on your phone that capture conversation are not on these lists.&lt;/p&gt;
&lt;p&gt;What then is happening? The answer is very much explainable by taking a left turn to talk about mycelium and mushrooms for a moment. Mycelium is the vegetative part of a fungus, consisting of a network of thread-like structures called hyphae. When a spore lands on a suitable ‘substrate’, it germinates and produces hyphae to ‘colonise the substrate’. This is where the hyphae consume the organic matter they’re attached to until the nutrient levels are completely depleted and the host is ‘saturated’ with hyphae and mycelium. Once the substrate has been entirely drained of resources, the fruiting body of the mycelium will form. The caps and stems we recognise as mushrooms are the fruiting bodies of these networks.&lt;/p&gt;
&lt;p&gt;The final and triumphant mushroom is our evidence that the substrate it sits in has been fully consumed by the underground mycelium. The mushroom is the output of an invisible process.&lt;/p&gt;
&lt;p&gt;Similarly, when we suggest that our phone ‘has to be listening to us’ to know our interests so well, we are only pointing at the mushroom. We have successfully identified the fruiting body of our total digital colonisation - but do not yet understand that the mycelium has taken root. We are the substrate and the fungus has successfully mapped and identified us, creating an accurate data profile.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;This data profile (devastatingly) is sophisticated enough to begin predicting the conversations we may have, and who we may be having them with. Pointing at these adverts and suggesting that the phones are recording us is akin to pointing at a mushroom and suggesting we save the substrate. The substrate has been consumed, you are only observing the outcome.&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;People (myself included) have a tendency to think they are exempt from the workings of the machine, and that we are engaging sustainably with social media at a healthy distance. Sudden and unexpected accuracy of targeted advertising is a jarring reminder that this simply isn’t true. The moment of bafflement we experience when we look down and see such a precise marketisation of our own interests should always serve as a warning; We are the substrate! We have been successfully digitally colonised, the mushroom has bloomed.&lt;/p&gt;
&lt;p&gt;It would be so much less worrisome if social media privacy abuse were as simple as recording you and spitting adverts back at you, but it is far more sophisticated and resource-efficient than that. We are correct to assume that our phones ‘observe us’. It’s not using the microphone, though, and it’s certainly not ‘interacting’ with you - it’s just accepting everything that you’re offering up to it.&lt;/p&gt;
&lt;p&gt;This observation is not a dialogic process, and not one that mimics a human conversation: you are simply being consumed. If you find that uncomfortable then you need to stop feeding the fungus.&lt;/p&gt;
]]></content>
  </entry>
  
  <entry>
    <title>smartphones and children</title>
    <link href="https://pistolas.co.uk/smartphones-and-children/" rel="alternate" type="text/html"/>
    <id>https://pistolas.co.uk/smartphones-and-children/</id>
    <published>2024-09-25T00:00:00Z</published>
    <updated>2024-09-25T00:00:00Z</updated>
    <summary>Reflecting on the landscape of children using smartphones</summary>
    <content type="html"><![CDATA[&lt;p&gt;&lt;i&gt; reflections after reading ‘&lt;a href=&quot;https://www.theguardian.com/technology/2024/sep/23/children-who-dont-have-smartphones&quot;&gt;Only 3% of UK 12-year-olds don’t have a smartphone. Here is how four of them feel about it&lt;/a&gt;’ on the Grauniad. &lt;/i&gt;&lt;/p&gt;
&lt;p&gt;Whenever we talk about smartphones and their impact we discuss the situation forgetting that “It is what it is” is actually “It is what the tech monopolists have spent billions making sure it is”. The smartphone itself is a wonderful tool that uses radio waves to send signals between endpoints in a rather vast network. The issue of course is going to be what is sent back and forth, and any mediums that exist in those spaces that have incentives outside of communication that mirrors or assists communication in the physical world.&lt;/p&gt;
&lt;p&gt;Nowhere is this more evident than in the conversations had by adults about how children use technology. In these conversations, the smartphone is inseparable from the services it enables. You can’t fault people for bundling the two into one, as for most the former is simply the physical prerequisite for the latter. I am always struck by just how much compromise we have to make with these devices; if you want your child to be able to call for an ambulance, keep in touch with their friends digitally, and call you to let you know to come and pick them up, then you must also expect to fork over the right of access to a large swathe of their developmental years to a digital monopolist who has a direct profit incentive to make them miserable, insecure, alienated, and unhealthy.&lt;/p&gt;
&lt;h3&gt;internal behaviour between peers&lt;/h3&gt;
&lt;p&gt;These internal social behaviours and the digital spaces they occur in are not essential characteristics of the social lives of children, but rather the consequences of anti-social design. Snapchat and Instagram gamify every single facet of communication, with scoring systems on every interaction and a lack of accountability built into the platforms. This allows and encourages hit-and-run dopamine hits on your peers without suitable mechanisms for recognising harm, facilitating apologies, and returning to group wellbeing. They are systems that produce individuated and individuating young people, with no infrastructure for responsibility or collectivism in the software that enables the majority of their communication. In-person communication then becomes a byproduct of their digital life, a reaction to the conversations spun up in broken-by-design apps.&lt;/p&gt;
&lt;h3&gt;external influences on children&lt;/h3&gt;
&lt;p&gt;Equally egregious is the allowance for a child to be bombarded by consumerist forces as soon as they are given a device that facilitates the development of a digital identity. Advertisements baked into every single platform dictate to the child who they will fail to be until they look or act a certain way. Swamped in the individuating infrastructure of the modern internet, the only economy of change is in what you and yours buy, and how you and yours present it online. Influencers and other forms of soft power serving global commercial interests spend more deeply engaged, face-to-face time with children than their parents do, embedding rhetoric that is not subject to accountability from fully developed adults, while ensuring a faith-like commitment to a set of ideals or ideas that are realised online but devoid of all authorship or critical thinking.&lt;/p&gt;
&lt;h3&gt;soft problems, hard solutions&lt;/h3&gt;
&lt;p&gt;The issues with these devices lie not in their capabilities, but in the realisation of a technology that could be designed to enable and enhance the very best parts of childhood, subduing or challenging those difficult components that we wrestle with today. It is trivial to limit devices at the hardware level such that applications like Snapchat and Instagram are not usable without meaningful evasion and alteration by the user. Even then, a suitable logging mechanism can ensure accountability for such efforts. These apps and the men who make them have sat at our children’s table without asking, and they should be told to leave - just as they would be in the physical world. When we discuss ‘analogue’ parenting with our communities, we arrive at a consensus and make changes to our physical world to protect and empower our kids, and the same can be true for the digital world.&lt;/p&gt;
&lt;p&gt;We ought to produce a mandate for the sanctity of childhood, and not let a commercial market for consumer electronics be metabolised as an unchangeable facet of the modern world that children must at some point be fed into. It is quite possible to produce child-friendly electronics, and the project of doing so does not need to be one where Snapchat gets to sit at the table.&lt;/p&gt;
]]></content>
  </entry>
  
</feed>
