I’ve messed with this dozens of times with various AI models that are generally good at abstractions, using advanced prompting, custom diffusion settings outside the typical ranges, and some hacks in the model loader code. I seem to lack the vocabulary to describe the fundamental geometry of centrifugal gravity in the kind of depth required. Like, how does one say that the road ahead curves up like a hill continuing overhead, with the buildings anchored to…

I need language here that has unambiguous specificity and likely does not occur in any other context. Layperson verbosity will fail to get traction without fine-tuning the model first. I prefer to explore what models can really do using settings and prompts only.

Most of my vocabulary and understanding of geometry is limited to the Cartesian planes and CAD assemblies. Perhaps someone here has a better lexicon and doesn’t mind sharing.

(Post image is from a Blender rendered video someone posted on YouTube about what life in an O’Neill cylinder might look like)

  • Rivalarrival@lemmy.today · 21 hours ago

    I think you’re looking for words like:

    • Prograde - In the direction of spin.

    • Retrograde - Against the direction of spin.

    • Nadirial or Anti-radial - Toward the center of rotation; “Up” in your centrifugal gravitation model.

    • Radial or Anti-nadirial - Away from the center of rotation; “Down” in your model.

    If you throw a bowling ball “prograde”, it will experience greater “gravity” than normal. If you throw it retrograde, it will experience less gravity than normal, unless you throw it at more than twice the rim’s own tangential speed, in which case it will again experience more gravity than normal.
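
    To make that concrete, here is a minimal sketch of the arithmetic. The habitat radius and throw speeds are my own assumed numbers, not anything from this thread; the only physics used is that the ball’s tangential speed at the rim is v_rim ± v_throw and the apparent gravity is v²/R.

    ```python
    # Apparent weight of a ball thrown tangentially inside a spinning habitat.
    # All numbers are illustrative assumptions.
    R = 250.0                  # habitat radius in metres (assumed)
    g = 9.81                   # target "gravity" at the rim, m/s^2
    v_rim = (g * R) ** 0.5     # rim tangential speed, about 49.5 m/s

    def apparent_gravity(throw_speed, direction):
        """Centripetal acceleration felt at the rim.

        direction = +1 for prograde (with the spin), -1 for retrograde.
        The ball's inertial tangential speed is v_rim + direction * throw_speed.
        """
        v = v_rim + direction * throw_speed
        return v * v / R

    throw = 10.0  # m/s
    print(f"standing still:    {apparent_gravity(0.0, +1):.2f} m/s^2")   # ~9.81
    print(f"thrown prograde:   {apparent_gravity(throw, +1):.2f} m/s^2") # heavier
    print(f"thrown retrograde: {apparent_gravity(throw, -1):.2f} m/s^2") # lighter
    # Retrograde faster than twice the rim speed: heavier than normal again.
    print(f"retrograde at 2.2x rim speed: {apparent_gravity(2.2 * v_rim, -1):.2f} m/s^2")
    ```

    The crossover sits at exactly twice the rim speed because that is where the ball’s inertial speed matches the rim speed again, just pointed the other way.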

    • xmunk@sh.itjust.works · 13 hours ago

      Hey now - any directional system not making use of Turnwise and Widdershins is one I want no part of.

      • TheRealKuni · 8 hours ago

        > Hey now - any directional system not making use of Turnwise and Widdershins is one I want no part of.

        And obviously going toward the main hub of the spacecraft should be called “Hubwards.” And away from the main hub, out toward the edge of space, we could call something like “Rimwards.”

    • neidu3@sh.itjust.works · 21 hours ago

      Additionally:

      Normal and antinormal - perpendicular to prograde and retrograde.

      Source: well over 2000 hours in KSP

      • Rivalarrival@lemmy.today · 21 hours ago

        Correct. Orthogonal to both the prograde/retrograde and the radial/anti-radial axes.

        AFAIK, “normal” follows the right-hand rule. If you point your straight index finger prograde, and your thumb points radially, your middle finger, bent perpendicular to the other two, is “normal”.
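
        If it helps, the same idea as a cross product; the frame labels here are my own assumed convention, and swapping radial-in for radial-out flips the sign:

        ```python
        import numpy as np

        # Assumed local frame (illustrative only):
        # x = prograde at your location, y = radial-out (toward the floor).
        prograde = np.array([1.0, 0.0, 0.0])
        radial_out = np.array([0.0, 1.0, 0.0])

        # The cross product is perpendicular to both inputs and points along
        # the habitat's spin axis, per the right-hand rule.
        normal = np.cross(prograde, radial_out)
        print(normal)    # [0. 0. 1.]  -> "normal"
        print(-normal)   # antinormal, the opposite way along the axis
        ```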

    • j4k3@lemmy.world (OP) · 19 hours ago

      Interesting. I can get a small curve out of prompting with a straight road or sidewalk; using “anti-nadirial” gets a slight concave curve, like the term is weakly in the correct vector space but not powerful enough to bend buildings. That is some progress towards the required momentum. Thanks.

      • WoodScientist@sh.itjust.works · 23 hours ago

        Ah. AI slop. It is not a god. It is not a human being. It is an image generator. It does not have the ability to generate things far outside of the space of visuals that already exist in its data set. And there simply aren’t enough of these visuals in its training data to create another one. It can’t create anything new. It can only create from the average.

      • SGforce@lemmy.ca · 22 hours ago

        Diffusion models have a very limited understanding of language compared to modern LLMs like GPT-4 or Claude, etc.

        https://huggingface.co/docs/transformers/model_doc/t5

        Most image generators likely use something like Google’s T5 here. It is basically only meant to translate sentences into something a diffusion model understands. Even ChatGPT is just going to formulate a prompt for a diffusion model in the same way, and isn’t going to inherently give it any more contextual understanding.
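
        As a rough illustration of that bottleneck (the model name and prompt below are just stand-ins, not what any particular image generator actually ships):

        ```python
        from transformers import AutoTokenizer, T5EncoderModel

        # "t5-small" is only an example checkpoint; pipelines that use T5
        # typically ship much larger variants.
        tokenizer = AutoTokenizer.from_pretrained("t5-small")
        encoder = T5EncoderModel.from_pretrained("t5-small")

        prompt = "a road that curves up like a hill and continues overhead"
        tokens = tokenizer(prompt, return_tensors="pt")

        # The diffusion model never sees the words, only this embedding tensor,
        # which is where nuanced geometric language tends to get flattened.
        embeddings = encoder(**tokens).last_hidden_state
        print(embeddings.shape)  # (1, sequence_length, hidden_size)
        ```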

        The simple answer is they are simply not there yet for understanding complex concepts. And I suspect that the most impressive images of impossible concepts they can drum up are mostly by chance or by numbers.

          • SGforce@lemmy.ca · 15 hours ago

            Nevertheless, these models are trained on broad yet shallow data. As such, they are glorified tech demos meant to whet the appetite of businesses and generate high-value customers who could further tune a model for a specific purpose. If you haven’t already, I suggest you do the same: curate a very specific dataset with very clear examples. The models can already demonstrate the warping of different types of lenses, so I think it would be very doable to train one to better reflect the curving geometry you’re looking for.
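
            Purely as a sketch of that curation step (the file names, captions, and metadata.jsonl layout are assumptions on my part, though many image-caption fine-tuning pipelines accept something like it):

            ```python
            import json
            from pathlib import Path

            # Hypothetical curated set: stills from habitat renders, each with a
            # caption that uses the same geometry terms every time.
            captions = {
                "frame_0001.png": "rotating habitat interior, road curving up and "
                                  "continuing overhead, buildings on the inner surface",
                "frame_0002.png": "view along the spin axis of an O'Neill cylinder, "
                                  "landscape wrapping all the way around the viewer",
            }

            train_dir = Path("train")
            train_dir.mkdir(exist_ok=True)
            with open(train_dir / "metadata.jsonl", "w") as f:
                for file_name, text in captions.items():
                    # "file_name" plus a caption column is a common convention,
                    # e.g. for Hugging Face "imagefolder" datasets.
                    f.write(json.dumps({"file_name": file_name, "text": text}) + "\n")
            ```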