• barsoap@lemm.ee · 9 months ago

    An AI able to do that kind of analysis would be an AGI. Also: garbage in, garbage out. Without knowledge of the system, you cannot know what you actually want.

    Let’s take NIMBYs as an example: a municipality wants to drop parking minimums, fund public transport, and start a couple of medium-density housing/commercial developments around new tram stops in the suburbs. The goals: fix its own finances (no longer subsidising infrastructure in low-density areas with taxes from high-density land), save suburbanites money (cars are expensive, and those tram stops are at most a short bike ride from anywhere), and generally make the municipality nicer and more liveable. Suburbia is up in arms, because suburbanites are, well, not necessarily idiots, but they don’t understand city planning.

    The issue here is not one of having time to read through statutes, but of a) knowing what you want, and b) trusting that decision-makers aren’t part of the “big public transport” conspiracy out to kill car manufacturers and your god-given right as a god-damned individual to sit in a private steel box four hours a day while commuting, without even being able to play Flappy Bird while doing it.

    Even if your AI were able to see through all that and advise our NIMBYs that the new development is in their interest, why would they trust it any more than they trust politicians? Sure, the scheme would save on steel and other resources, but who’s to say the AI doesn’t want to use all that steel to produce more paper clips?

    Questions to answer here revolve around trust, the atomisation of society, and alienation. AI ain’t going to help with that.