home.social

#iteration — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #iteration, aggregated by home.social.

  1. 💡 #UnityTips for beginners: Create a prototype to test your game idea! 🔬 Create a simple prototype of your game concept to test the mechanics, gameplay, and user experience. Refine and iterate until you have a polished product. #unity #unity3d #gamedev #prototype #iteration #IndieDev #IndieGame

  3. I couldn't find a copy of the memo named "Loop Iteration Macro" by Glenn Burke and David Moon, January 1981 (MIT/LCS/TM-169), at MIT's DSpace site, or anywhere else. So I scanned in my copy and have uploaded it to my web site.

    NOTE WELL: This document was written prior to CLTL and describes a facility that was available in MACLISP and the Lisp Machine's Zetalisp. Common Lisp drew design ideas from this, but the syntax, semantics, and associated functions/macros described in this are NOT the same as what Common Lisp offers.

    For example, my recollection from long ago (which I did not re-check before making this post) is that there are other differences in syntax because this earlier version of Loop was underconstrained in the ordering of the keywords in a way that let you write some expressions that the committee felt might confuse people with their results.

    But also, for reasons that slip my mind, Common Lisp did not adopt the define-loop-path macro that is described starting on page 19.

    nhplace.com/kent/History/macli

    #lisp #maclisp #loop #iteration #ComputerHistory #KentsHistoryProject #LispM #Zetalisp #CommonLisp

    cc @screwlisp

  4. I totally forgot that I mentioned #iteration to point out that I work in #series. It's not just a #MissKitty thing. It's pretty common in #digital #art. The way I plan to do #digital #exclusivity is the #nft gets all versions. I like to say they are all there, but you can't see them all at once. 🔦🔦🔦

  5. Atomos, Firmware, and Fraud

    A reasonable person would conclude, after looking at this Sony FX3A page on the Atomos site and reading the word “Yes,” that the “Touch to Focus” function on the Shinobi II monitor works with the Sony FX3A camera. And a reasonable person might then rely on that “Yes,” that affirmation, that assurance by the manufacturer, to make a decision. A reasonable person might decide, say, to purchase a Shinobi II monitor in order to take advantage of its Touch to Focus feature. But a reasonable person would be wrong to do so.

    And wrong in a very specific way: the reasonable person would not have taken into account that the word “Yes” here means “No,” or, more precisely, “No, actually, not at the moment, but take our word for it, we’ll eventually get around to it.” That is, the Touch to Focus feature is not usable now, not with the FX3A, but the people at Atomos tell me (in the passive voice, almost as if Atomos has no say in the matter) that “the Shinobi II is expected to receive…support” for Touch to Focus “soon.”

    The problem is firmware. In an email, Atomos tells me:

    The Ninja RAW, Ninja TX Go, and Ninja TX running firmware version 12.5.1 have received support for camera control and touch focus tracking for several new cameras, including the Sony FX3A. This support is not implemented on Shinobi II yet.

    Shinobi II is expected to receive this support in upcoming firmware soon.

    What “soon” means is anyone’s guess. In mid-October of 2025, Atomos told another (presumably reasonable) person that “Touch to focus support” for the FX3A “has already been added to the feature request list for future consideration,” but at the time Atomos could offer “no official confirmation or timeline.” Nearly six months later, support still has not materialized, and Atomos is still “unable to confirm an exact date,” they tell me in another email. But “rest assured,” the “Atomos Team” adds, “the update is scheduled for release in firmware version 11.07.” The clear, positive assurance of “Yes” on which I (and other reasonable people) relied has been converted to another kind of assurance: a promise of good things to come, deferred to an upcoming release.

    Maybe there are people who buy camera equipment or other tech with the expectation that they won’t be able to use it now, or won’t be able to take advantage of all its features now, but might be able to do so at some point in the future. I’m not among them. Maybe that has to do with my age, my pecuniary habits, or with how I think reasonable people should act.

    In any case, it looks as if we still have a little way to go before firmware version 11.07. How long is unclear; Atomos is evasive, or at least doesn't want to be pinned down. The Shinobi II currently runs firmware 11.06.02, which was released on 27 January 2026. The release just before that, according to the company site, came out on 16 June 2025. So version 11.07 could come out next month, or six months from now, or longer. There could be 11.06.03, .04, .05, and so on before that. So far as I can tell, there is no regularity to the releases, not even any consistency in the numbering of firmware releases published on the site (the list jumps, for instance, from 11.05 to 11.06.02). Nothing, once again, to rely on.

    We’ve grown accustomed to tech companies rolling out features and support over time. Cory Doctorow has written and spoken about how this process can be abusive, entrapping people and contributing to what he calls enshittification; but iterative rollouts, proprietary lock-ins, right to repair, and other issues that Doctorow focuses on are not really the issues crying out for remedy here. The more immediate issue is the blatantly misleading information on the Atomos website. Touch to Focus? “Yes.” That’s good old-fashioned fraud. It doesn’t matter, from the customer’s perspective, if the company plans one day to include that feature in a software update; a reasonable person will assume that Yes means Yes.

    The law sees it that way, too, doesn’t it? Deceptive or misleading product descriptions like this one are among the Deceptive Acts or Practices covered by Federal Trade Commission Act Section 5. California state law and New York business law also have provisions covering deceptive practices and misleading statements, whether by commission or omission. (Other states do as well.) I don’t know if these laws have ever been tested against firmware releases (I should look), and I can’t say how plaintiffs would fare. Tech companies might want to take refuge in fine-print, heavily lawyered disclaimers, as Atomos does on its site, or argue that the product they are selling is both hardware and software, and that software is an iterative product, so product descriptions are really just promises companies make or break along the way. But that argument would make a mockery of the law, reinforce Doctorow’s point, and just further erode trust in tech companies.

    The question is what to do about all this. Like other Atomos customers, I am a member of a misled or, as I might prefer to put it, defrauded class. That’s a fact to consider, but it doesn’t solve my immediate problem. In the near term, I need to decide whether to return this monitor or hang on to it, wait for 11.07,** and hope that this time Atomos means what it says. I’m open to suggestions.*

    *PS 28 March 26: After trying some different camera setups this morning, I am leaning toward returning the Shinobi II, using my old but serviceable Feelworld F6 5.7″ monitor, and just doing without the touch focus feature on the monitor. It would be nice to have but it’s hardly essential.

    **PPS 6 April 26: Atomos just released firmware update 11.07, which supports tap to focus for the Sony FX3A. With B&H closed for Passover until Friday the 10th, I haven’t yet been able to return the monitor, so I am going to update the firmware, see if and how the Touch to Focus works, and then make a decision.

    PPPS 11 April 26: Yesterday B&H Photo opened after its holiday recess, so I returned the monitor. The firmware update got the Touch to Focus feature working, with the camera set to flexible spot, but after playing with it a bit I decided it was not for me. It will probably work just fine for others. Anyway, the broader point about firmware, and how tech has trained us to buy and live with products that are captive to unpredictable and irregular updates of proprietary platforms, stands.

    #AtomosShinobiII #deception #enshittification #filmmaking #fraud #FTC #iteration #iterativeDevelopment #promises #reasonableness #reliability #software #trust

  6. @stebbs

    Thanks for the suggestion! I added a Raspberry Pi and a camera to the car. It doesn't work very well right now, but I'll improve it. I made a video for it, but since it's not that good, I made the video about the importance of iteration instead! : )

    Here's the video: https://youtu.be/9mMziE28QPU?si=hi_X-fqQVkA9cH31

    Future iterations (of the car and the YouTube video) will be much better!

    #iteration #hobbyist #rccars #robotics

  8. AI and the Cult of the Lazy Amateur

    Reading Time: 6 minutes

    We are all familiar with the phrase. We learn better when we write things by hand than when we type them on a keyboard. For decades people have been against typing, saying, "It's cold, it's dead, it's impersonal." Those same people will then say "you have awful handwriting" and yet still feel nostalgia for handwritten notes. In the age of AI, more is being lost.

    Ten Thousand Hours of Practice

    When people my age were growing up, if you wanted to sing, you needed to devote ten thousand hours to developing tone, timbre, and a good voice. I am not a singer, which you can tell by what I just wrote. The point is, to be a singer you needed to dedicate yourself to the craft.

    Photography

    Some might say that digital photography is much easier than analogue, because if you make a mistake you see it instantly. You can also take plenty of photos until you get it right. With film cameras you have 36 photos per roll, and it's only when you develop the film that you see whether you made a mess or not.

    Video

    Years ago I had the VX-1E video camera, which was replaced by the VX-2000, and then another after that. With broadcast cameras, and with video cameras from an earlier era, the zoom was manual, not servo-controlled. You could adjust aperture, focus, and zoom by hand. With newer cameras you lost more and more direct control. Control is now electronic, rather than tactile.

    Now, with DJI drones and other camera gear, cameras pan and tilt automatically to follow the action, and a single operator can control four cameras at once via a console. A human is no longer behind each camera.

    We hear about AI taking jobs, but robotics replaced camera operators a decade or two ago.

    As if that wasn't enough, editing tools do more and more of the work, with the aim of replacing the professional video editor with an amateur influencer. This is where we get into Andrew Keen's Cult of the Amateur and the AI topic.

    Our mobile phones are now our cameras, and image stabilisation makes phone camera footage usable. This means we do away with the camera operator. With new video editing tools that automagically edit video, very badly for now, the need to learn Montage Theory is vanishing.

    If you watch YouTuber content, it's recognisable. They all use the same sound effects, the same transitions, the same lazy approach to presenting to camera. If any YouTube content was shown within a university or film school setting, it would be panned as kitsch, cliché and worse. On YouTube, where the cult of the amateur reigns, the new sub-culture has become the norm.

    I use the word sub-culture because, although I hate the style, it is popular, and those from newer generations than mine appreciate and enjoy it.

    When I was growing up I read The Film Sense by Sergei Eisenstein, and I read books about basic Betacam work, basic editing, and many other topics. Because editing systems would cost 30,000 CHF for a player deck, 30,000 CHF for a recorder deck, and yet more for the edit controller, I learned to edit on paper, and in my mind, impatiently waiting for access to edit suites.

    It's with the Miro DC30+ and Adobe Premiere that I could really learn to edit, as well as with the Sony DHR-1000 edit deck. When I first bought Final Cut Studio it cost me 1,600 CHF. A few years later I bought Final Cut Pro as an individual app for 300 CHF, and that licence has been valid on every Mac I have owned or used since then.

    On Threads especially, I see amateur video makers (or, more flatteringly for them, influencers and YouTubers) speaking about how long editing takes, and how hard it can be. The more you edit as a professional, the faster and more efficient you become as an editor.

    Years ago I watched an editor called Jesus, editing on Avid Media Composer and he was really fast and efficient with the edit suite. He knew the edit suite inside and out. For him that edit suite was part of him. I have seen many professional editors who are like that, who know the tool inside and out.

    In contrast, with CapCut, and GoPro's edit suite, and Virb, and other editing packages, the edit suite does the work for people. They shoot material, and then the edit software prepares the rough cut, and they fine tune it.

    In some cases they use AI to tidy it up rather than learning how to do this themselves. They use AI to colour grade, and to add AI generated music, and more. In the end the art, and auteur aspect of editing is being lost, and replaced by something more generic.

    Books and Audiobooks

    I noticed on Audible that a lot of books included in the subscription have AI narration. For some this might be welcome, but for me, if we pay as much as we do for audiobooks, they should be narrated by human beings rather than by AI.

    When I see that books are written by AI, I don't see a point in paying for them. To me, the point of paying a person is to have their vision and creativity, rather than paying for something generated by an LLM. If I wanted to read something generated by an LLM, I would pay for the LLM and provide it with my own prompts, and fine-tune them.

    Why would I pay for content generated by an LLM, if I can provide similar prompts and generate it by myself, for myself?

    AI Content Farms and Spam

    A few weeks ago we saw posts about people generating podcasts and blog posts with AI. In the past, when people created blog farms, they were immediately flagged as spam. For some reason, today the desire to generate blog posts and podcasts with AI is acceptable, and yet the question I ask is: "If you can't take the time to create content with humans, then why should humans read it, or listen to it?" We are already inundated with human-created content, without adding AIGC (AI Generated Crap) to the mix.

    In the age of information overload it doesn't make sense to use AI to add to the noise; it makes sense to do the opposite. No one benefits from AI adding to the noise.

    AI for Recommendations

    When I look at YouTube, Google News, and other sites, AI is used to recommend stories, but I feel that AI aims at the lowest common denominator for everyone, rather than within our content niche. It would make sense for AI to learn from my habits and recommend according to my interests, without using the interests of others to recommend content. Too often the recommendations are awful; at other times they are too narrow. They are not as tailored as they could be.

    AI for Upscaling videos

    In the early days of YouTube we uploaded videos that were as light as possible, because we lacked the bandwidth to upload bigger files, but also because people lacked the bandwidth to download larger videos. We have forgotten the times when we waited for videos to load.

    Sometimes we would press play, and then pause, and let the video load while we went to get something to drink or a snack, and then come back once the download was finished.

    One way to save time was to upload low-resolution videos. The material might have been shot in 720×576 (SD), but we downscaled it for the sake of sharing on YouTube. People like me have first videos dating back to 2005 or so. Things have changed in 20 years.

    In broadcasting, upscaling has been around for years, so that SD footage can be broadcast today. Look at Frasier, Friends, and other series. Plenty of old series were shot in SD and are now broadcast in HD, after being upconverted.

    YouTube is using AI to upconvert its back catalogue, and although some people hate the idea, I see the value. A 240p or 320p video on a 4K screen is the size of a postage stamp. Some of my videos, by being upscaled, will gain a new lease of life.

    For me, upscaling is a good use of machine learning/LLM/AI technology because it fills a niche that, as humans, we would spend months or even years trying to fill manually.

    There is a chance that LLMs/machine learning will hallucinate and distort what people see, but at the moment low-resolution videos can't usefully be played full screen. In an ideal world we would go back to the edit files and re-export them at higher resolutions using new technologies, but as things stand, YouTube's approach requires less returning to the archives.

    Archive Restoration

    When I used to listen to MacBreak Weekly, I found it interesting to hear about how machine learning was being used to compare every frame of certain films, in order to do digital restorations of old films, as well as of film archives.

    I have over a hundred tapes that I should digitise and then restore. Beta SP, MiniDV, DVCAM, and other tapes are great because information is stored on tape, so if part of the tape fails, the rest is left. The drawback is that if you play old Beta SP tapes, the sound plays back more slowly than the original, so it has to be cleaned up. The same is true of dropout with all of these tapes.

    Although the cult of the amateur might say "We don't want our footage to be upscaled, because of hallucinations," upscaling so much footage will help train the models used to do a better and better job, and as we digitise and salvage raw footage from old digital video tapes and analogue tapes, we recover historic artifacts that could otherwise have been lost, or hard to watch.

    And Finally

    For me, AI/machine learning and large language models should be used to streamline the tedious parts of video, photography, and more, without replacing human creativity. I don't think that writing a prompt is creative.

    With Photoprism and Immich, we see how machine learning can be used to help sort and catalogue gigabytes of photos by location, names of people, names of mountains, colours, seasons, and more.

    I will value human-generated content over AIGC any time.

    #AI #craft #dedication #devotion #experience #iteration #kitsch #laziness #passion #skill #sloppy #stamina #technology

  9. @ChuckUFarley @david6677 yes, is making it more difficult with each new - they offer no documentation and all the Devs can do is attempt to . Needless to say that it’s time consuming and near impossible.

  10. "Experiment. Learn. Repeat. The only formula you need!" - Futurist Jim Carroll

    Wrap your head around the growth of knowledge - and what you need to enhance your knowledge to deal with its reality.

    You should know that you don't know what you need to know.

    You should know that you need to know it.

    And then, set out to know it.

    When you do so, know that you can only know if you get involved with it.

    Get involved with it.

    You'll probably fail.

    But that's OK, because you've learned about it.

    So do it again.

    Put this on repeat.

    Compound what you know.

    That's what you need to know. Warren Buffett said it best:

    “Read 500 pages every day. That’s how knowledge works. It builds up, like compound interest. All of you can do it, but I guarantee not many of you will do it.”

    Call it just-in-time knowledge on steroids.

    #Experiment #Learning #Knowledge #Growth #Compounding #Iteration #Failure #Progress #Development #Mastery

    Original post: jimcarroll.com/2025/06/decodin

  11. Individuals using Artificial Intelligence to generate #design ideas perform well. But groups using #ParallelDesign methods—with or without AI—produce more of the top 10 ideas, writes NN/g (see link).

    I recall 1990s studies that found machine-mediated sketching (using tablet, stylus or mouse) lowered #creativity compared to sketching by hand on paper.

    How would a #group make their hand sketching available to their #AI #teammate …?

    nngroup.com/articles/ai-creati

    #ideation #iteration #generativeAI #UX

  12. We invite you – Iteration 4.0! The annual exhibition of our bachelor's degree programme is entering its next round, and we would be delighted to give as many people as possible an insight into our studies.

    Free entry for all!

    📅 Date: 7–8 February 2025
    📍 Location: Schanzenstraße 22, Köln-Mülheim

    More info at 👉 coco.study/iteration/

    #code #studieren #studiengang #köln #cologne #coco #thKoeln #university #uni #students #informatik #economy #germany #codeandcontext #iteration

  13. Agile was intended to address the problem of waterfall software development: delivering the wrong thing too late.

    When "Agile" teams only want to code something once – no acceptance that usability testing might reveal a failing that necessitates another iteration – it's just more waterfall development with Agile-flavoured rituals and ceremonies.

    #agile #dev #SoftwareDevelopment #waterfall #iteration #usability #UX #UXD #UR #UserResearch

  14. There is one more non-production cutover before production. Which means there is another chance to #rehearse. And another chance to #execute. And another chance to #learn. And another chance to feed the #lessons learned into the next #iteration.

  15. Studio #iteration 2. Shifted the Push controller between the #moog #subharmonicon & #DFAM and the #modular palette. It makes a bit more ergonomic sense after trying some work yesterday. Sadly, that old Apple keyboard (17 years old) has decided a few keys, including the shift and space keys, won't work. The old Apple Cinema Display, of the same vintage, is still going strong. Compared to modern displays, I find it easier on the eye for working.

    #composer #studio #producer
