How Cloud Services Allow Tech Titans to Act as Knowledge and Information Gatekeepers

Yves here. Although a very tiny example of the thesis of the article, your humble blogger can testify that there’s a great deal of commercial pressure to put data in the cloud. Not only do tech companies and software consultants recommend cloud backup and storage as if there’s no possible downside, but for some commercial services like website hosting, few vendors will offer local storage (as in your data and backups on dedicated servers that they control, but at different locations).

This article gives a fine overview of the dangers of cloud services to their customers, including dependence and exposure to predation.

Sadly, while the cloud services market is highly concentrated, regulators are behind the curve in recognizing its importance, let alone doing something about it.

By Cecilia Rikap, Lecturer and Program Director of the BSc in International Political Economy (IPE) at City, University of London; Tenure Researcher, CONICET; Associate Researcher, COSTECH lab, Université de Technologie de Compiègne. Originally published at the Institute for New Economic Thinking website

At the end of July, Microsoft and Google’s parent company, Alphabet, presented their latest, relatively disappointing, financial results, blaming them on macroeconomic distress. What may have gone unnoticed is that both companies referred to their clouds as the main engines of growth. The cloud was also responsible for Amazon’s better-than-expected quarterly results.

The cloud refers to computing resources, including software, hardware, and platforms, offered as services through the Internet instead of running locally on individual computers. By 2025, 45% of the world’s data storage will be on the cloud. We are constantly storing information and accessing online applications through the cloud.

Moving operations to the cloud is also crucial for companies. Yet this mass transfer of information technology from inside organizations to the cloud is a very recent phenomenon. In 2012, firms spent just USD 6.5 billion on cloud infrastructure services; by 2021, spending had jumped to USD 178 billion, an increase of 2,638%. While the cloud is used by all sorts of companies and public sector organizations, its ownership is overwhelmingly dominated by just three firms: together, Amazon, Microsoft, and Google control around 65% of the cloud infrastructure services market.

This market dominance matters more than concentration in other markets because it entrenches tech giants’ control of digital technologies, which reinforces their global power and the value they capture from other businesses in the form of intellectual rents paid to use those technologies. Companies developing specific artificial intelligence (AI) applications, for example, depend on tech giants’ cloud services, including access to large, cleaned databases for training their own AI models. They also rent generic AI solutions, like facial recognition or autocomplete for written text, which they integrate into their targeted or specific applications.

Netflix recently stated that it relies on services provided by Amazon’s cloud (Amazon Web Services, AWS) and that it could not easily switch to another cloud provider. Other platform companies like Uber — which can only operate by accessing Google Maps — and Booking have made similar claims about their technological dependence on big tech companies.

Since the same lines of code can be used by many customers simultaneously, the reproduction costs of selling AI algorithms as cloud services tend to zero. Hence, as Amazon, Microsoft, and Google expand their client base, profits increase exponentially, to the point where AWS is Amazon’s most profitable business. Furthermore, since the AI code rented as a service includes deep learning algorithms that learn as they process data, the more these algorithms are rented out, the more they learn and self-improve, thus reinforcing the three giants’ digital leadership.

And this is not all. The cloud offers tech giants a chance to sneak into (and copycat) thousands of organizations around the world. Like Amazon’s marketplace, AWS does not only sell Amazon’s own computing developments as services; it is also a platform that enables other companies to sell their own computing services. Among them, Elastic offered its products Elasticsearch and Kibana through AWS. As their popularity grew, AWS started offering its own versions of these services, displacing Elastic from the market.

Cloud computing is also a strategic industry. It allows promising businesses to be spotted early, by tracking growth in companies’ consumption of data storage and processing power as well as their increasing use of different AI services. As a result, the three market leaders use information gathered from their clouds to identify, and eventually fund, existing businesses, or to start promising new ones. By letting other companies fail or succeed first, big tech companies reduce their investment risks while keeping long-term economic profits.

Microsoft’s acquisition of Nuance, a cloud-based system for medical transcription services, for USD 19.7 billion is a case in point. Nuance was already running services on Microsoft’s infrastructure before the acquisition. Acquiring Nuance gave Microsoft a strong foothold in cloud services for the healthcare industry, a source of colossal datasets to be exploited with artificial intelligence. No wonder that, when the acquisition was announced, Microsoft’s CEO, Satya Nadella, tweeted: “AI is technology’s most important priority, and healthcare is its most urgent application.” Yet the acquisition also expanded Microsoft’s intellectual monopoly beyond healthcare, reinforcing its overall business and in particular its cloud, because it gave Microsoft access to Nuance’s more than 1,000 patents and to the secretly kept knowledge that had placed the latter at the frontier of speech recognition.

The tendency towards market dominance premised on privileged access to data is exacerbated by the fact that the code underpinning cloud services is not made accessible to customers. Customers become ‘locked in’ and dependent on the services provided by the dominant cloud providers. This constrains their opportunities to learn from the code they purchase as a cloud service. Customers know what certain services can be used for, but they cannot learn from the rented code, since they cannot access the actual algorithms doing the work. This holds even if parts of those algorithms were developed by universities and public research organizations.

This is true even when those customers are other major corporations. Siemens, for example, is the European leader in the number of AI patents it has been granted. But Siemens is also dependent on big tech clouds, including for the most advanced generic AI needed to build the more specific applications that Siemens integrates into its medical imaging, energy, and transportation products. Only a year after Siemens launched MindSphere, a cloud platform for storing and analyzing IoT data retrieved from the equipment it sells, AWS took over part of this platform’s development. AWS provides computing services that Siemens cannot develop in-house and that it needs in order to provide AI-specific solutions to its clients.

This form of technological dependence is risky for at least two reasons. First, Google, Amazon, and Microsoft have already entered Siemens’s medical business and could become serious rivals. Second, unlike the first ICT wave, in which technology adopters could learn by using and adapting technologies, leading to complementary innovations, cloud computing offers technology as a black box. It therefore limits users’ learning and generates a form of long-term technological dependence with no visible way of moving beyond it. All this while tech giants’ algorithms self-improve by processing the data harvested by companies like Siemens, further widening the technological gap between cloud providers and other firms. As this dependence deepens vis-à-vis the digital leadership of the tech giants, Siemens may keep reducing its own development of MindSphere, relying instead on services accessed directly through tech giants’ clouds.

Siemens is one of thousands of companies basing their digital transition on analytics, databases, and IoT provided as cloud services by tech giants. As the use of these forms of platform-as-a-service accelerates — they have the highest growth rate within the cloud infrastructure services market — we may expect tech giants’ leadership, built on expanding technology enclosures, to be reinforced. As firms lose their technical autonomy and subordinate themselves to cloud solutions, value transfers in the form of intellectual rents paid to tech giants to use, but not really access, digital technologies will expand. These rents deepen polarization among firms and, in turn, foster wealth and income inequalities. This scenario points to forms of economic power that elude existing regulatory frameworks.

Regulating the Cloud

The European Union’s Digital Markets Act is probably the world’s most advanced digital policy, thus the right place to look for cloud computing regulations. This Act aims to expand the EU’s bargaining power against core platform companies by unifying member states’ digital economy rules and carrying out market investigations at the EU level that may lead to sanctions for non-compliant behavior.

It remains to be seen whether the European Commission will be able to enforce this Act. Its past fines against Google in several antitrust cases — Google Shopping (2010), Google’s Android (2015), and Google AdSense (2016) — were never collected. The European Commission also ruled against Apple and Ireland over illegal state aid through selective tax breaks, but the EU General Court annulled the decision.

There is an additional limitation of special relevance for regulating the cloud. Although the Digital Markets Act identifies platforms’ potential role as gatekeepers even when they are not dominant in competition law terms, it remains focused on markets. Core platforms will be fined only if they are found to be market gatekeepers: for instance, if they systematically privilege their own products and rank third-party ones lower in customers’ searches on their platforms.

The term “cloud” appears only 14 times in the latest 193-page provisional public version of the legislation. The cloud is introduced only as an example of a platform with potential market gatekeepers. Not a word is said about how Amazon, Microsoft, and Google operate this business, expanding their knowledge appropriation while subordinating other organizations. These giants are not only market gatekeepers but also knowledge and information gatekeepers. If the European Commission, its member states, and other governments seriously want to introduce legislation that can counterbalance these firms’ power, this form of gatekeeping must be prevented.

The emergence of private competition, as leading European corporations recognize, is limited by the capital-intensive, in particular infrastructure-intensive, nature of the cloud. But even more challenging is the fact that competition is not the best solution for the cloud. Artificial intelligence algorithms sold as services in the cloud self-adjust and learn — thus improving — the more data they process. More competition would come at the expense of efficiency (each algorithm would process less data, thus producing less digital intelligence) and, therefore, of potentially lower prices. This is a textbook case of what the economics literature calls a natural monopoly.

Just as with other natural monopolies, such as electricity, access to computing services on the cloud is becoming crucial for businesses. Yet, unlike electricity, where the main regulatory dimension is tariff regulation, the most sensitive side of the cloud business is not prices but knowledge and information. Since it is impossible to limit digital learning when third-party data is processed, corporations should not be the main, and certainly not the only, cloud providers. On the contrary, a solution could be to build a cloud operated democratically as an international public consortium. This might be one way to effectively tackle global knowledge and information gatekeeping in the digital world. It could also set a precedent for knowledge and information sharing in other fields and place some real limits on monopoly pricing, not just through its direct effects on prices, but by making code more available.


30 comments

  1. digi_owl

    More and more it seems that the computing industry is split between those running smaller, localized operations that do not need much more than what Microsoft has been offering for decades by now, and massive international orgs that need to supervise thousands of computers spread across the world.

    And it is the latter that are dictating all development, which then complicates life for the smaller orgs and individuals.

  2. SocalJimObjects

    Just like everything in life, there are pros and cons. One thing that’s very hard to do properly in tech is security, and this is where cloud services (if you know how to utilize them properly) are superior to thousands of small/medium firms “learning” to do security properly. “Learning” to do security is also a type of oxymoron. When there’s a security breach, the only ones learning stuff are the hackers, as in they get to learn stuff they aren’t supposed to know. “But the developers will do better next time!!!” Not in my experience. Developers get paid when they deliver new features, and when there’s a time crunch, guess what gets ignored? Things like security testing.

    As to the limits of the cloud, Dropbox actually left AWS and built their own data centers. I actually interviewed with them one time, and they said that AWS was just too limiting for the kind of stuff they had in the pipeline, so no, cloud services aren’t magic either.

    Now my take on the whole security thing. The tech pioneers also ignored security when they started out, and as a result, security now gets handled in multiple places in the tech stack, some in the O/S which is pretty low level (remember there are different O/Ses out there), and some higher up the stack (antivirus, cloud services, etc), so you are never quite sure how secure your application is. In my opinion, someday there will be a reboot of the whole computing industry where security will be built in at a very low level from the beginning, and it will be harder to write programs, as in not everyone will be able to get a tech job just by copying and pasting code from Stack Overflow. Maybe, just maybe, cloud services will be obsolete then.

    1. Carolinian

      So would moving the world’s desktop terminals from Microsoft to Linux help with that security original sin? It would be interesting to know how many of the world’s malware and cybercrime problems track back to the Windows operating system with its many flaws. And it’s not just to Windows that they track, but also to Bill Gates’ favored intellectual property model of computing. Others were against this from the beginning, and the original point of the personal computing movement was to escape the clutches of IBM and centralized computing. Now those garage computer builders are all “two legs good, four legs bad” and offer up a cloud solution to their own mistakes. It was a revolution coopted, like so many others.

      And yet it is allowing us to carry on this very discussion. We as individuals can revolt against “the cloud.” Unfortunately it sounds like for businesses that must compete it will be a lot harder.

      1. SocalJimObjects

        “So would moving the world’s desktop terminals from Microsoft to Linux help with that security original sin?”

        Not really. The macOS used to be “secure”, and then of course something like this happened: https://www.wired.co.uk/article/macos-process-injection-flaw. Yes, I know the macOS is not based on Linux, but bear with me here. There used to be a time when people found it not worth their time to hack the macOS; I know, hard to believe, right? Anyway, my point is that, right now, not very many people use the desktop version of Linux, but once people start using it in greater numbers, the security flaws will emerge left and right.

        1. digi_owl

          Mac survived on security by obscurity.

          Basically only Americans used Mac for the longest time, making it a small target for as long as the goal was widest possible “ownage”.

          But after Jobs managed to turn Apple into a lifestyle brand, and criminals started dealing in identity theft, Mac became a far more juicy target. And has suffered accordingly.

      2. hunkerdown

        Real security systematically defends against specific threats. Ontological security opportunistically defends against insults to mushy proprietary feelings. Anyone selling feelings for cash is scamming you, in this as every department of life.

        That said, OS switches exchange one set of vulnerabilities for another set of vulnerabilities. systemd, the modern monolithic process life-cycle manager for desktop Linux, is surrounded by mildly suspicious circumstances and designed by a guy who sprang onto the scene writing privileged services in unchecked languages.

        I agree with SocalJim’s prediction that the industry will probably be rebooted to eliminate general purpose computing; that transition is already in progress. I don’t agree that it’s a good or worthwhile thing, or something not to fight every step of the way.

    2. scott s.

      I don’t agree that tech pioneers ignored security. If you look at Multics, it was intended as a secure timesharing OS. But I don’t know if the market aside from DoD was much interested. Even with all the money in DoD to throw at problems, it seemed kind of unsolvable. It seemed like all the physical security mechanisms surrounding the classification system ended up getting necked-down to just 3 levels, largely relying on air-gap.

      I don’t think the cloud services issue discussed here is so much about seat management, but about big data, the need for huge training set data and huge node sets.

      1. Dave in Austin

        And OS/2, the old IBM system, is so secure it is still used on many bank terminal systems.

      2. SocalJimObjects

        You are thinking that security can only be done from the O/S level up. That’s just not true. The chip guys only paid attention to performance, energy and efficiency and completely ignored security.

        Also, you are talking as if the cloud providers can take data from Company A, B, C, and D anytime to create huge data sets. Perhaps they are doing that through backdoors, but that needs to be proven. Also, even if, say, Amazon has access to data from Company A, someone from the latter will still need to explain the meaning of the data set. It’s not that straightforward. Mixing and matching junk will simply create more junk. As the saying goes, garbage in, garbage out.

    3. hazelbrew

      such a technology-centric view of security! it is so tempting to point the finger at “technology” as both the problem and the solution when it comes to being secure, and that is simply not true. Technology is just one part of an overall security stance / system / posture.

      the softest part in any system is the people: social attacks, phishing, etc.

      we’ve just been through ISO 27001 and SOC 2 at work – a good reminder that technology is only one part of this. and we get the message that cloud is “better” shouted at us by the platform we work on all the time (Atlassian as it happens). and it’s a simplistic marketing message designed to fool the customer. yes, cloud services can more efficiently learn how to secure their own platform, but that is not the same as securing the whole system.

      and as for the Dropbox example – part of the reason for someone like Dropbox building their own data centres is economics. past a certain scale the cost from AWS becomes uncompetitive and eats into your margin, and hence market cap. Dropbox saved $75m pre-IPO by doing this.
      I have linked before but this is an excellent read:
      A16z analysis of cost of cloud

      cloud is quick to start, but as you scale becomes a drain on finances.

      1. Polar Socialist

        I believe most of the security breaches even today are people using the system in a way they are not supposed to use it. As in checking records they don’t need to check, even if they are allowed to.

        The second most common breach at least used to be people printing sensitive information and then forgetting it in the printer.

        This happens in law enforcement, health care, NSA, taxi companies, you name it.

        As somebody I know used to say, we can have a secure system if we remove the network completely, there’s an armed guard next to every computer ready to detain people who do something they’re not supposed to do, and everyone is strip-searched when they leave the premises. In other words, we can’t have an absolutely secure system.

        Or, to put it another way: computer security is a process, not a state. You need to engage people in that process.

  3. Colonel Smithers

    Thank you, Yves.

    One organisation that is worried is the Bank of England. The central bank / regulator echoes the concerns you express in the first two paragraphs of your introduction and adds that these giants are cannibalising the financial system, able to dictate terms to customers and resistant to regulatory and commercial pressure due to size, being US firms and US defence contractors, and the lack of non-US alternatives.

    Financial services providers have additional obligations, but cloud providers treat them as if they are “mom and pop shops”. There was talk over the summer of bringing cloud providers into the scope of UK financial services regulations when they transact with UK financial services firms, but that idea has been put on the backburner as other events are keeping regulators and their political masters awake at night.

    Your final introductory paragraph is right to some extent. There are few regulators and, frankly, few staff at clients (outside IT) who understand the issues. It’s also partly a matter of pay. Last December, two days after an EU27 bank offered me a job, the Bank of England’s financial / systemic stability team offered me the opportunity to supervise financial services firms and, in particular, their external contractors, which may be overseas affiliates, and ensure the stability and resilience of the UK financial system. I would much rather work for the Bank of England, am genuinely interested in the subject, and want to do some public service, but the Bank of England’s package was between half and two-thirds of what the EU27 firm offered, typical of the mismatch between officialdom and private sector firms. That difference is / was at the low end of the scale.

    Had I accepted the Bank of England offer, I would be Colonel Smithers now. I would love to join the old lady of Threadneedle Street and have kept lines of communication open.

    1. Thuto

      Thank you CS

      The problem as I see it is that regulation is part and parcel of a political apparatus meant to appeal to voters; that’s why regulators typically go for low-hanging fruit and soft targets in consumer tech like facebook, because those can be more easily packaged into a “government fighting big tech on behalf of the common man” narrative and used as fodder to lure voters. Tech companies that enjoy monopoly power selling to the enterprise typically escape the scrutiny that comes with being in the regulator’s crosshairs (despite using their entrenched position in the enterprise to sell inferior knockoffs of products from successful startups; when was the last time Microsoft provoked the ire of regulators and became the subject of any big investigation? Slack had to sell itself to Salesforce to survive the frontal attack Microsoft Teams had launched against it. Zoom is next in its crosshairs, and in fact, it can be argued that MS has already parked its tanks on their lawn). Despite having the same deleterious effects of market power concentration as the likes of facebook, enterprise giants like Microsoft, AWS and Google Cloud still aren’t receiving the attention they deserve from regulators and only appear on the regulatory radar when they do large acquisitions. Imo this focus on political wins instead of doing the real work required to curb the excesses of monopolies with a proclivity to abuse their pricing power and distribution leverage, paired with a lack of skills and an inability to compete on remuneration for skilled workers, is what renders current regulators largely impotent in their attempts to rein in the giants.

      The other area where big tech (and big business in general) leaves regulators eating dust is messaging. For decades corporate spin masters have branded regulation a headwind preventing innovators from innovating on behalf of consumers, while pitching deregulation as a tailwind that will cast off the yoke of oppressive regulations from said innovators, allowing them in turn to bring to market innovations that benefit consumers and create common prosperity. Regulators have not been able to successfully counter this obviously misleading messaging, and until this happens (and they beef up their war chest to recruit and reward the right technical talent), progress is going to be slow.

  4. Godfree Roberts

    The European Union’s Digital Markets Act is probably the world’s most advanced digital policy,

    I think China’s is a step ahead, especially in its handling of platform abuses.

    Chinese cloud providers are being scrutinized now just as social platforms were last year, and to the same ends: putting providers and customers on a level playing field.

    1. hunkerdown

      Zoom out a bit. It’s a bezzle, but it’s also creating new capabilities for owners and guarantors of property to predict and counter threats to their perceived legitimacy, and to automate the reproduction of capitalism (the real value of this AI nonsense). This was also true for Web 1.0, during which rush several RDBMS packages and other statistical processing tools grew up rather quickly.

      I’m not sure that bezzles are new capitalist relations struggling to come into being, as a rule, but it’s an interesting heuristic.

  5. Glossolalia

    The mid-size software company I work for just recently finished a multi-year effort to migrate all of our infrastructure from servers we manage to AWS. As far as I can tell, the cost savings are just not there. This is above my pay grade, but it seems crazy to me to commit yourself to a highly proprietary infrastructure that is very difficult to extract yourself from. AWS can basically start charging us anything they want and we’d be hard pressed to be able to move out of it.

    1. curlydan

      Exactly. Colleagues recently said that the cloud services we run at work were running slightly slower. Why? Because the compute time and number of processors were getting to be too expensive. So our processors were cut from 8 to 4 (or something like that) even though we were continually sold on the cost savings. Could we not see this coming from a mile away??? But I’m just an old guy ranting at work.

      1. Duke of Prunes

        You have to remember that “too expensive” is all relative in the corporate world. Your costs may have actually gone down with your initial cloud deployment vs. the old way, but now, manager has new goals to continue to improve financials so cloud makes it easy (sort of) to scale down. A few clicks of a button and, presto-chango, you have 4 cores instead of 8. The fact that productivity suffers because everything takes twice as long, not his/her problem (right now).

        1. digi_owl

          Capex vs opex seems to be the old saw that everything dies on.

          Best I can tell, the cloud thing is one part “outsourcing” mania, one part making infrastructure and wages someone else’s problem.

          Microsoft these days is pushing heavily to move company networks onto their Azure-hosted Active Directory services, turning Windows into not much more than a very overweight “thin client”.

  6. BillS

    Cloud services are often linked to many “software as a service” (SaaS) applications. I work in hardware engineering, where we actually build stuff. Designing this stuff requires the use of design and modeling software. You are strongly encouraged by ads in trade publications to base your designs on subscription software services. These services often run in the cloud (meaning that you don’t even have said software on your computer, but some kind of interface client). At this point, you have lost complete control over the security of your data, since it is now scattered over lord-knows-how-many servers in lord-knows-what server farm. Companies like Cadence, Autodesk, and Dassault Systèmes love this model, because once you get locked into their SaaS, there is no escape. Should you lapse in your subscription payments, or should these companies make updates that are incompatible with older versions, you will not even be able to access your old designs. Anyone who works on long-term engineering projects should be deeply afraid of the effects that this business model will have on their project life cycle.
    IMHO
    1) SaaS and cloud services are first and foremost a rent-extraction scheme.
    2) these services represent grave security risks – you have lost control of your data.
    3) The obsolescence/incompatibility risk is a nightmare for archiving/reusing old designs
    4) You have lost control of computing availability – what happens if the cloud servers are not available?
    5) Black box algorithms “dumb down” the design/simulation process. You have no idea what is really simulated, i.e. where errors can creep into simulations or if all parameters have been correctly incorporated.
    6) Customer service with SaaS/cloud services is often beyond shocking. You pay $25-50k a year for a subscription and you end up in phone menus when you need help. And then, more often than not, the call center representative cannot get you an answer to your question.

    Where I work, we are using old versions of design software that we know are reliable and secure (on Linux systems). Open source software is also used often, where appropriate. For us, the SaaS and cloud models are too much of a risk. We are not the only ones. A couple of our large institutional customers have quietly admitted to funding in-house software development, mostly based on open source, to avoid being locked into ever more crapified SaaS.

    1. digi_owl

      That said, many of the same companies were very militant about copyright before SaaS, with things like hardware dongles being popular and the threat of BSA audits ever looming.

  7. Ben

    The Irish government has admitted that the existing and approved cloud server farms will use 40% of the electricity generated in Ireland. What that will do to the electricity supply, where the peat-fired generators have been shut down and gas is used to fill the gap, is anyone’s guess.

    1. JBird4049

      If I understand this comment, Ireland, and by implication, the rest of Europe, could have a choice between keeping people alive or keeping businesses alive?

      Fun.

      1. digi_owl

        Not much of a choice really, as the server farms will have long standing contracts with the power companies for reliable supply.

        1. PlutoniumKun

          Big industrial users are almost always on contracts that allow power companies to restrict supplies during supply crunches. Most server farm usage is not time critical, so they can generally power down for a few hours without a major impact. Many are located in Ireland precisely because the weather and power system permit this flexibility (unlike in hot countries, where the maximum need for power comes during hot days when there is already peak loading). In many respects, they benefit the Irish grid (up to a point), as their inherent flexibility provides an element of additional flexibility to the supply/demand match (always a huge problem in small island grids).

          So in Ireland those farms are not a particular issue for this winter. It’s more a case that they have led to an unexpected overall increase in demand, which will lead to a need for a major expansion in overall network capacity, something that’s proven a bit of a struggle due to delays in roll-outs of off-shore wind farms and interconnectors to the UK and France. Ireland has vast potential wind and solar and generally is a net exporter of renewable energy. But at the moment the grid is hitting a point of major stress, as demand has grown at a far faster rate than the grid can handle, and infrastructure investors are using this as leverage to try to squeeze more support from the government (for example, about 2GW of fully permitted and contracted solar capacity is currently held up in wrangling over subsidies).

  8. Savita

    Another insidious aspect is Microsoft designing the Windows operating system to be increasingly reliant on the cloud. From reading things like Bruce Schneier’s blog, Schneier on Security, it seems the destination for Microsoft is for the local side to be a thin client, with the operating system and all content served up via the cloud. The user has diminishing control, let alone ownership, and pays a licence fee for the Windows operating system ‘service’ to be loaned. The article did not mention the massive amount of electricity the cloud consumes globally. Only a percentage of that data is actually necessary and regularly accessed; the rest of the electricity is spent simply keeping data in the cloud and in cold storage, unseen and unused by the operators. This is in the same arena as the US military being so responsible for destructive emissions. By this I mean a gargantuan source of emissions, and one singularly overlooked.

    1. digi_owl

      Yep. Windows 11 now insists home users sign up for a cloud login on first boot.

      And as part of that it also applies encryption to the C drive.

      Encryption tied to the cloud login.

      So if you should for any reason lose access to that login, your local files become permanently inaccessible.

      We thought phones would become pocket computers, but instead computers have become oversized phones.

  9. Savita

    Further to my comment. Having now just read the enlightened commentariat, I note Ben touched upon the electricity aspect of the cloud and BillS gave an excellent discourse on the perils of the cloud’s rent-extractive ‘software as a service’.
    Much has been said over a long time on Schneier on Security about the AWS cloud and its security risks. IIRC its data has already been breached and distributed into the wild.
    I will echo hunkerdown’s comment about systemd. Whatever Linux distro you use, be very sure you choose one that does not involve systemd. Tell your friends and relatives and make a fridge magnet with the motto

  10. Dave

    AI is a good reference point.
    It is a challenging area.
    In practice, there is more to AI than ‘algorithms’ and cloud services.

    The so-called tech / surveillance sector have responsibilities if they are serious about AI.
    I suppose it’s worth reflecting on the development of frameworks like XAI.

    Explainable AI (XAI) is a methodology that has been built to deal with the issues that can arise when opaque output from computer programs is treated as useful data.
    In practice this opacity can be problematic and can lead to output from faulty algorithms and poor software being integrated into decision making processes.

    In the real world there can be a big difference between ‘stuff that came out of a computer’ and useful ‘data’.

    An important goal of XAI is to provide ‘algorithmic accountability’.
    This requires an open approach to building AI systems that goes beyond a closed black-box methodology, where the inputs and outputs are known but the algorithms that are used to arrive at a decision are not available for public scrutiny and are not necessarily understood by the people that developed the algorithms and the software that is used to build the black-box AI system.

    In practice, ‘I don’t know how I got the answer’ isn’t a satisfactory response to a question about an AI system, even if the answer appears to be useful.

    Richard Feynman on ‘cargo cult’ science: “The first principle is that you must not fool yourself.”
    https://www.themarginalian.org/2012/06/08/richard-feynman-caltech-cargo-cult-science/

    We can extend this to the AI tech sphere, “you must not fool yourself”, even if you can make lots of $$$ out of fooling yourself and fooling others.

    The rise and rise of grot-tech, led by people like American Reggie Perrin (Bill Gates), may generate lots of revenue but may ultimately lead to an NAI framework (NOT AI).

    The Fall and Rise of Reginald Perrin
    https://www.imdb.com/title/tt0576224/

    “Microsoft simply used bad algorithms to begin with, and it never bothered to replace them with good algorithms”

    “What We Don’t Know About Spreadsheet Errors Today: The Facts, Why We Don’t Believe Them, and What We Need to Do”

    Mistaken Identifiers: Gene name errors can be introduced inadvertently when using Excel in bioinformatics

    A short review of significant numerical issues that can arise when using Excel

    Spreadsheets in the Cloud – Not Ready Yet

    Sloppy nebulous tech in the cloud opens up a Pandora’s box of harmful nonsense that has the potential to do a great deal of harm.

    AI is an area that comes with a duty of care and responsibilities to society and the environment, e.g.

    Google and Microsoft warn investors that bad AI could harm their brand

    (“As AI becomes more common, companies’ exposure to algorithmic blowback increases”)

    Energy and Policy Considerations for Deep Learning in NLP

    Creating an AI can be five times worse for the planet than a car

    Artificial Intelligence Can’t Think Without Polluting

    It’s important to demonstrate the viability of AI systems. Serious work in this field involves stuff like ahem ‘traceability’ and ‘transparency’. Beyond profits for Wall Street and ‘insiders’ like the upper echelons of the ‘tech sector’ what’s the point of wasting energy on stuff that doesn’t work?
