• 0 Posts
  • 20 Comments
Joined 2 years ago
Cake day: June 16th, 2023


  • Eh, I prefer KDE. It’s fairly uncluttered unless you actively mess with it and want it that way, while Gnome is pretty ruthlessly “our way is the right way”.

    Once upon a time they only allowed virtual desktops to be in a column. Then someone decided that columns weren’t for everyone, so obviously they made it only be in a row, despite most implementations having supported a grid layout for ages.

    Window title search. This is fantastic for managing a lot of windows. I wish KDE could get better by using screen reader facilities to let you search window contents as well, but having the facility in show windows view at all is great.

    Their window tiling is less capable even than Microsoft Windows’.

    Any attempt to customize means extensions, and they seem to constantly break the interfaces the extensions need. I had to face the reality that every update had me searching for a replacement extension because they broke one that wasn’t maintained anymore.

    But either way, the open desktop shells are better than the proprietary ones.


  • It’s also a good example of how an open source project manages to outmaneuver big company offerings.

    Home Assistant just wants to make the stuff work. Whatever the stuff is, whoever makes it, do whatever it takes to make it work so long as there are users. It also warns users when something is difficult to support due to cloud lock-in.

    All the proprietary stuff wants to force people to pay a subscription and buy their products, or products that licensed the right to play in their ecosystem. So they needlessly make stuff cloud-based, because that’s the way to take away user control. They won’t work with the device you want because that vendor didn’t pay up for the privilege.

    Commercial solutions may have more resources to work with, and that may be critical for some software, but they divert more of those resources toward self-enrichment at the expense of the user.



  • I had been planning to, but I’ve been lazy about enabling it in my IDE setup while giving it the benefit of the doubt. Your feedback resonates with how much I end up fighting auto-complete/auto-correct in normal language, and I can see it potentially ruining current code completion (which sometimes I have to fight, but on balance it helps more than it annoys). I suppose I’ll still give it a shot, but with even more skepticism. Maybe it can at least provide an OK draft of API documentation… Maybe sometimes…

    On the ‘vibe coding’: in the cases I’ve seen detailed, people do something that, to them, is a magical outcome from technologies that intimidated them. However, it’s generally pretty entry-level stuff for those familiar with the tools of the trade, things you can find already done dozens of times on GitHub almost verbatim, with very light bespoke customization. Of course there is a market for this; think of all the ‘no code’/‘low code’ things striving to make very basic apps approachable that just end up worse than learning to code.

    When a project manager struggles to make a dashboard out of that sort of sensibility, a dashboard that really has no business being custom but tooling has fostered the concept that everyone needs a snowflake dashboard, it’s a pain. Maybe AI can help them generate their dashboard, but being a human subjected to the workflows those PMs dream up is a nightmare. It’s bad enough that at my work there are already hundreds of custom issue fields, a dozen issue types, and 50 issue states, with maddening project-to-project unique workflows to connect the meaning of all this; I don’t like AI emboldening people to customize further.

    The thing about ‘vibe coding’ is what happens when they get stuck and get confused/frustrated about why the LLM stopped giving them what they want. One story was someone vibe coding up a racing game. He likely marveled as his vision materialized: from typing prose without understanding how to code, he got some sort of 3D game with cars and tracks and controls, something that struck him as incredibly difficult otherwise but reachable through ‘vibe coding’. Then he wanted to add tire marks when the player did something (maybe on a hard turn) and it utterly couldn’t do it. After all the super hard stuff, why could the LLM not do this conceptually much simpler thing? Ultimately it spat out that the person needed to develop the logic himself (claiming it was refraining because it would be better for him to learn, but I’m wagering that’s just the text it generated after repeated attempts to produce code it simply could not).


  • I occasionally check what various code generators will do when I don’t immediately know the answer. It’s almost always wrong, but recently it might actually have been correct, just surprisingly convoluted. It had the task broken down into about 6 functions iterating through many steps to walk the data through various intermediate forms. It seemed odd to me that such a common operation was quite so involved, so I did a quick Internet search, ignored the AI-generated result, and saw the core language built-in designed to handle my use case directly. There was one detail that was not clear in the documentation, so I went back to the LLM to ask that question and it gave the exact wrong answer.

    I am willing to buy that with IDE integration I can probably get much richer function completion for small, easy stuff I already know how to do and save some time, but I just haven’t gotten used to the idea of asking for help on things I already know how to do.





  • I’ve got mixed feelings on the CHIPS act.

    It was basically born out of a panic over a short-term shortage. Many industry observers accurately stated that the shortages would subside long before any of the CHIPS spending could even possibly make a difference, and that the tech companies would then point to this as a reason not to spend the money they were given.

    That largely came to pass, with the potential exception of GPUs in the wake of the LLM craze.

    Of course, if you wanted to give the economy any hope for viable electronics while also massively screwing over imports, this would have been your shot. So it seems strategically at odds with the whole “make domestic manufacturing happen” rhetoric.



  • This was roughly the state of affairs before, but things have relented and software password managers are now allowed to serve the purpose.

    So if a hardened security guy wants to only use his dedicated hardware token, with registered backups, that’s possible.

    If a layman wants to use Google password manager to just take care of it, that’s fine too.

    There’s also much in between: using a phone instead of something like a YubiKey, using an offline password manager, etc.


  • “The same reason that humanoid robots are useful”

    Sex?

    The thing about this demonstration is that there’s a wide recognition that even humans don’t want to be forced into voice interactions, and this is a ridiculous scenario that resembles what the 50s might have imagined the future to be, while ignoring the better advances made along the way. Conversation is a maddening way to get a lot of things done, particularly scheduling. So in this demo, a human had to conversationally tell an AI agent the requirements, and then that AI agent acoustically coupled to another AI agent which actually has access to the scheduling system.

    So first, the acoustic coupling is stupid. If the two agents recognize each other, spout an API endpoint at the other end and take the conversation over IP.

    But the concept of two AI agents negotiating this is silly anyway. If the user’s AI agent is in play, just let it directly access the system the other agent is accessing. An AI agent may be able to facilitate this efficiently, but two only makes things less likely to work than one.

    “You don’t need special robot lifts in your apartment building if the cleaning robots can just take the elevators.”

    The cleaning robots, even if not human shaped, could easily take the normal elevators unless you got very weird with the design. There’s a good point in there that the obsession with human-styled robotics gets in the way of a lot of use cases.

    “You don’t need to design APIs for scripts to access your website if the AI can just use a browser with a mouse and keyboard.”

    API access would greatly accelerate things even for AI. If you’ve ever done Selenium-based automation of a site, you know it’s so much slower and more heavyweight than just interacting with the API directly. AI won’t speed this up. What should take a fraction of a second can turn into many minutes, and a large number of tokens at large enough scale (e.g. scraping a few hundred business web UIs).
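
    To make that contrast concrete, here’s a rough sketch (in Python) of fetching the same data via a direct API call versus driving a browser with Selenium. The base URL, endpoint, and CSS selector are made-up placeholders, not any real service.

    ```python
    # Rough sketch: direct API call vs. driving a full browser for the same data.
    # The URL, endpoint, and CSS selector are hypothetical placeholders.
    import time

    import requests
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    BASE = "https://example.com"  # stand-in for some business web UI

    # Direct API: one HTTP request, structured JSON back, typically sub-second.
    start = time.perf_counter()
    orders = requests.get(f"{BASE}/api/orders", timeout=10).json()
    print(f"API: {len(orders)} records in {time.perf_counter() - start:.2f}s")

    # Browser automation: launch a browser, render the page, scrape the DOM.
    start = time.perf_counter()
    driver = webdriver.Firefox()
    try:
        driver.get(f"{BASE}/orders")  # the page load alone can take seconds
        rows = driver.find_elements(By.CSS_SELECTOR, "table#orders tr")
        print(f"Browser: {len(rows)} rows in {time.perf_counter() - start:.2f}s")
    finally:
        driver.quit()
    ```

    Multiply the per-page browser overhead (plus the tokens, if an LLM is reading the rendered page) across a few hundred sites and the difference stops being academic.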





  • Except many, many experts have said this is not why it happens. It cannot count letters in the incoming words. It doesn’t even know what “words” are. The input has been abstracted into tokens by the time it’s run through the model.

    It’s more like you don’t know the word strawberry, and instead you see: How many 'r’s in 🍓?

    And you respond with nonsense, because the relation between ‘r’ and 🍓 is nonsensical.
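
    If you want to see what the model actually gets, here’s a quick sketch using OpenAI’s tiktoken tokenizer (the exact split depends on the encoding; the point is just that the letters disappear into opaque IDs).

    ```python
    # Sketch of why letter-counting is awkward for an LLM: the model never sees
    # letters, only token IDs. Requires the tiktoken package.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    token_ids = enc.encode("strawberry")

    print(token_ids)                             # a short list of integer IDs
    print([enc.decode([t]) for t in token_ids])  # the chunks the model "sees"

    # Ordinary code can count letters in the original string trivially...
    print("strawberry".count("r"))  # 3
    # ...but the model is only handed the integer IDs above, with nothing that
    # says one of them "contains" two r's.
    ```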



  • First I’m not sure if he’s autistic or just saying “I’ve got Asperger’s so I’m allowed to be an asshole and you just need to deal.”

    But let’s assume he is…

    High functioning autism is not associated with involuntary Nazi salute gestures. It’s also not associated with the inability to learn the significance of the gesture.

    So if autism is somehow related, then how would it be?

    Well the “nice” option is he is going full 4-chan troll mode thinking it is hilarious without processing just how bad it is.

    The other option is that he thought he did a credible cover to blow a dog whistle, but was unable to process that he was blowing a tuba instead.

    In short, even if it was because of autism, it almost certainly means it was still quite on purpose, so it’s hardly an excuse that makes things any better.


  • jj4211@lemmy.world to memes@lemmy.world · Can't wait!

    Unfortunately, this time around the majority of the AI build-up is GPUs that are likely difficult to accommodate in a random build.

    If you want a GPU for graphics, well, many of them don’t even have video ports.

    If your use case doesn’t need those, well, you might not be able to reasonably power and cool the sorts of chips that are being bought up.

    The latest wrinkle is that a lot of that overbuying is likely to go toward Grace Blackwell, which is a standalone unit. Ironically, despite being a product built around a GPU and needing a video port, its video port is driven by a non-Nvidia chip.