• theluddite@lemmy.ml · 22 hours ago

    I didn’t find the article particularly insightful but I don’t like your way of thinking about tech. Technology and society make each other together. Obviously, technology choices like mass transit vs cars shape our lives in ways that the pens example doesn’t help us explain. Similarly, society shapes the way that we make technology. Technology is constrained by the rules of the physical world, but that is an underconstraint. The leftover space (i.e. the vast majority) is the process through which we embed social values into the technology. To return to the example of mass transit vs cars, these obviously have different embedded values within them, which then go on to shape the world that we make around them.

    This way of thinking helps explain why computer technology specifically is so awful: Computers are shockingly general purpose in a way that has no parallel in physical products. This means that the underconstraining is more pronounced, so social values have an even more outsized say in how they get made. This is why every other software product is just the pure manifestation of capitalism in a way that a robotic arm could never be.

    edit to add that this argument is adapted from Andrew Feenberg’s “Transforming Technology”

    • hsdkfr734r@feddit.nl · 20 hours ago

      I like the way you argue, but I'm not able to entirely grasp what you're trying to say. English isn't my native language; that may play into it.

      Technology is constrained by the rules of the physical world, but that is an underconstraint.

      I.e. this sentence. :) Would you rephrase it and give an additional example?

      I kind of get the mass transit vs. cars example, although I think both options have their advantages and disadvantages. It becomes very apparent to me when… let's say, when you give everyone a car, send them all into rush hour together, and transform our cities into something well suited for cars but not so much for people. But that doesn't make the wheel or the engine evil in itself.

      Also: society and its values affect technology, which in turn affects the environment the society lives in. Yes, I get that when I think, for example, about the industrialisation of the 19th century.

      I struggle with the idea that a tool (like a computer) is bad because it is too general purpose. Society, hence the people and their values, defines how the tool is used. Would you elaborate on that? I'd like to understand the idea.

      • theluddite@lemmy.ml · 17 hours ago

        No problem!

        Technology is constrained by the rules of the physical world, but that is an underconstraint.

        Example: Let’s say that there’s a factory, and the factory has a machine that makes whatever. The machine takes 2 people to operate. The thing needs to get made, so that limits the number of possible designs, but there are still many open questions like, for example, should the workers face each other or face away from each other? The boss might make them face away from each other, that way they don’t chat and get distracted. If the workers get to choose, they’d prefer to face each other to make the work more pleasant. In this way, the values of society are embedded in the design of the machine itself.

        I struggle with the idea that a tool (like a computer) is bad because it is too general purpose. Society, hence the people and their values, defines how the tool is used. Would you elaborate on that? I'd like to understand the idea.

        I love computers! It’s not that they’re bad, but that, because they’re so general purpose, more cultural values get embedded. Like in the example above, there are decisions that aren’t determined by the goals of what you’re trying to accomplish, but because computers are so much more open ended than physical robots, there are more decisions like that, and you have even more leeway in how they’re decided.

        I agree with you that good/evil is not a productive way to think about it, just like I don't think neutrality is right either. Instead, I think that our technology contains within it a reflection of who got to make those many design decisions, like which direction the workers should face. These decisions accumulate. I personally think that capitalism sucks, so technology under capitalism, after a few hundred years, also sucks, since that technology contains within it hundreds of years of capitalist decision-making.

        • hsdkfr734r@feddit.nl · 7 hours ago

          factory example

          Thanks. I think I get it now. Besides physical constraints (availability of resources, natural laws and our knowledge of them), society's inherent values and rules (like work safety, minimum wage, the worth attributed to a group of people, the environment, or animals) affect the way things are done.

          If the workforce is cheap and abundantly available, and the workers' health or wellbeing isn't considered very relevant, the resulting solution to achieve something is very different from one with different preconditions.

          computers … because they’re so general purpose, more cultural values get embedded. Like in the example above, there are decisions that aren’t determined by the goals of what you’re trying to accomplish, but because computers are so much more open ended than physical robots, there are more decisions like that, and you have even more leeway in how they’re decided.

          The moral/social/economic decisions which are made are affected by the opportunities a technology has to offer? OK, yes.
          The versatility of computer technology makes it a tech which can be used in many harmful ways. The potential for harm is bigger than, let's say, with the invention of the wheel or the plow, but not as big as with nuclear fission.

          Responsibility for the usage of a technology and finding common rules for its usage and enforcing them… hmm.

          Technology and what we do with it can’t be viewed as independent aspects?

          • theluddite@lemmy.ml · 2 hours ago

            I’d say that’s mostly right, but it’s less about opportunities, and more about design. To return to the example of the factory: Let’s say that there was a communist revolution and the workers now own the factory. The machines still have them facing away from each other. If they want to face each other, they’ll have to rebuild the machine. The values of the old system are literally physically present in the machine.

            So it’s not that you can do different things with a technology based on your values, but that different values produce technology differently. This actually limits future possibilities. Those workers physically cannot face each other on that machine, even if they want to use it that way. The past’s values are frozen in that machine.