Same, though in my case my sole use is the IR. Replaced so many random little remotes from random little light up things.
Now would be a good time to look for a .com you like, or one of the more common TLDs, and register it at Namecheap, Porkbun, or Cloudflare. (Cloudflare is cheapest, but all-eggs-in-one-basket is a concern for some.)
Sadly, the cheap or fun TLDs have a habit of being blocked wholesale, either because the cheap ones are overused by bad actors or because corporate IT blacklists “abnormal” TLDs (or only whitelists the old ones?) as “easy security”.
Notably, XYZ also runs that 1.111B initiative, selling numbered domains for 99¢, which further feeds their affordability for bad actors and justifies a flat-out sinkhole of the entire TLD.
I got a three character XYZ to use as a personal link shortener. Half the people I used it with said it was blocked at school or work. My longer COM poses no issue.
Is there a list anywhere of this and other settings and features that could/should certainly be changed to better Firefox privacy?
Other than that I’m not sure I’m really going to jump ship. I think I’m getting too old for the “clunkiness” that comes with trying to use third party/self hosted alternatives to replace features that ultimately break the privacy angle, or to add them to barebones privacy focused browsers. Containers and profile/bookmark syncing, for example. But if there’s a list of switches I can flip to turn off the most egregious things, that would be good for today.
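For anyone wanting a starting point, here are a few of the commonly cited about:config switches, expressed as a user.js fragment (drop it in your profile folder). This is just a sampler, not a vetted or complete hardening list; projects like arkenfox maintain the thorough versions.

```js
// Partial example only; verify each pref against current Firefox docs.
user_pref("datareporting.healthreport.uploadEnabled", false); // telemetry upload
user_pref("app.shield.optoutstudies.enabled", false);         // Shield studies
user_pref("browser.newtabpage.activity-stream.showSponsoredTopSites", false); // sponsored tiles
user_pref("privacy.trackingprotection.enabled", true);        // tracking protection everywhere
```

The nice part of user.js is that it is plain text: easy to diff, sync between machines, and delete if something breaks.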
Forgive me, I’m no AI expert and can’t fully relate the tokens-per-second measurements to the average query Siri might handle, but I will say this:
Even in your article, only the largest model ran at 8 tps; the others ran much faster, and none of them were optimized for a task, just benchmarking.
Would it be impossible for Apple to run a model optimized for expected mobile tasks, and leverage their own hardware more efficiently than we can, to meet their needs?
I imagine they cut out most worldly knowledge and use a lightweight model, which is why some requests still need to link out to ChatGPT or Apple. Would that let them trim Siri down to perform well enough on phones for most requests? They also advertised launching AI on M1 and M2 chip devices, which are not M3 Max either…
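To put the tps numbers in perspective, here is the back-of-the-envelope arithmetic. The token counts are illustrative guesses, not measurements of any real Siri query:

```python
# Rough latency estimate: seconds to generate a reply of a given
# length at a given token rate. Numbers below are hypothetical.

def reply_seconds(tokens: int, tokens_per_second: float) -> float:
    return tokens / tokens_per_second

# A short assistant-style reply of ~30 tokens:
slow = reply_seconds(30, 8)    # largest benchmarked model: 3.75 s
fast = reply_seconds(30, 30)   # a faster, smaller model: 1.0 s
```

So even the slowest figure in the article is in "sluggish but usable" territory for short replies, and a task-tuned small model could plausibly be much quicker.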
Onboard AI chips will allow this to be local.
Phones do not have the power to ~~~
Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer? Gotta have those latest and greatest chips.
It will be fun to see how it all shakes out. If the AI can’t run most queries on the phone with all this advertising of local processing…there’ll be one hell of a lawsuit coming up.
EDIT: Finished looking for what I thought I remembered…
Additionally, Siri has been locally processed since iOS 15.
https://www.macrumors.com/how-to/use-on-device-siri-iphone-ipad/
I think there’s a larger picture at play here that is being missed.
Getting the weather is a standard feature for years now. Nothing AI about it.
What is “AI” is: “Hey Siri, what is the weather at my daughter’s recital coming up?”
The AI processing, calculated on-device if what they claim is true, is:
Well {Your phone contact name}, it looks like it will {remote weather response} during your {calendar event from phone} with {daughter from contacts} on {event date}.
That is the division of labor between on-device and cloud processing. The phone already has your contacts and calendar, so it does that work offline rather than educating an online server about your family, events, and location, and it requests the bare minimum from the internet: in this case nothing more than if you had opened the weather app yourself and put in a zip code.
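The split described above can be sketched in a few lines. Everything here is invented for illustration (the data shapes, the function names); the point is that only the weather lookup leaves the device:

```python
# Hypothetical sketch of on-device assembly: names, events, and dates
# come from local data; only a location goes over the network.

def assemble_reply(contacts, calendar, fetch_forecast):
    event = calendar[0]                    # e.g. the upcoming recital
    daughter = contacts[event["who"]]      # resolved locally from contacts
    # The single remote call: no names or relationships are sent,
    # just a location, like typing a zip code into a weather app.
    forecast = fetch_forecast(event["zip"])
    return (f"It looks like it will be {forecast} during your "
            f"{event['title']} with {daughter} on {event['date']}.")

reply = assemble_reply(
    {"kid1": "Alice"},
    [{"who": "kid1", "title": "recital", "date": "Saturday", "zip": "90210"}],
    lambda zip_code: "sunny",              # stands in for the weather service
)
```

The server only ever sees "90210"; "Alice", "recital", and "Saturday" never leave the phone.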
Plug it into a monitor or TV and keep an eye on the console.
I have an older NUC that will not cooperate with certain brands of NVMe drive under PVE. The issue sounds like yours: it would work for an arbitrary amount of time before crashing the file system and attempting to remount read-only, rendering the system inert and unable to handle changes like plugging in a monitor later, yet it would still be “on”.
I recommend Dockge over Portainer if you want a web admin panel. https://github.com/louislam/dockge
It’s basically docker compose in a website, and you can just decide one day to turn it off and use the compose files directly. No proprietary databases or other weirdness.
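For reference, getting Dockge running is itself just one compose file. This is roughly the snippet from the Dockge README at the time of writing; check the repo for the current version:

```yaml
services:
  dockge:
    image: louislam/dockge:1
    restart: unless-stopped
    ports:
      - 5001:5001
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./data:/app/data
      # Your stacks live here as ordinary compose files,
      # usable with or without Dockge.
      - /opt/stacks:/opt/stacks
    environment:
      - DOCKGE_STACKS_DIR=/opt/stacks
```

Since every stack it manages is a plain compose file under `/opt/stacks`, abandoning Dockge later really is just `docker compose up -d` in each directory.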
I agree cash is the right idea, for now, but can you say for sure cash payment will be possible forever, or even the next 50 years? Wouldn’t it be better to blunder around with new ideas while cash is still a good fallback? Not saying I like crypto, and the cost on resources and the environment sucks bad, but I can at least appreciate them trying something. Now we just need to come up with sustainable options…
I get that cash seems a pretty durable idea, and it’s lasted for hundreds of years, but it did so before the massive societal turn towards technology we’ve made in the last 30 years.
Do you game at all? Gaming on Linux has made great strides, to be fair, but for a lot of titles you still need to consider a dual boot of some form of Windows, thanks to over-the-top anti-cheat, DRM, and developer support.
Something to consider for the gamers out there.
Absolutely. You can even throw the telephone in there. At the start it was a great way to reach Grandma across the country or the doctor across town. Now most of the traffic on it is robots and extortionists trying to fool Grammy into giving her money for some lie or another.
I don’t even answer my phone for unknown numbers anymore: be on my short contact list, or leave a voicemail reminding me you’re someone I should put on that list, or nothing doing. Sucks for anyone putting me down as an emergency contact though…
And I feel TV being 25% ads is pretty conservative… oh, but streaming! Swap the ads and channels you don’t want for a higher per-channel price and no ads… oh wait, now you get the higher price and the ads!
There’s a whole lot going towards ending the web as we know it.
Censorship, consolidation, AI, greed, to name a few.
Why, I couldn’t even get into the article before it faded into a paywall.
I get people want to be paid but splashing cash on every page is not the internet as I knew it.
Getting to this article from a social site (Lemmy) was also not how I knew it; that’s the consolidation part. It started after MySpace, in the era of Facebook pages: fewer personal websites, fewer websites in general, just get everything from Facebook and Reddit.
And sure, AI is also going to water down content, with prompts written by cheap corporate lackeys, and we will still have to pay subs for it after a social site sends us there.
And then there’s also the censorship and laws coming out to restrict what’s available. First to protect the children while they are young, then more to “protect” them as they get older, and eventually they will know nothing but state approved media.
To quote the article,
It’s the End of the Web as We Know It.
And I’m old and bitter about it. It had good promise, but enshittification took hold as was inevitable.
Second this. I don’t believe the chef would care.
Whether all at once, over hours, for one table or six, all you are to the chef is plates to be filled. Except for timing a table’s dishes to go out together, they wouldn’t even care which table an order goes to, much less whether it’s the same customer making repeat orders or a quick table turnaround through multiple customers. The chef gets paid the same either way.
No, I think this is solely about the server. Your choices annoyed her, and if tips were involved, even more so. The quicker you’re in and out, the quicker you leave your tip and she gets another tipping customer seated, which depending on your location could be very important to her livelihood.
I wonder if it can be detected by the streaming apps. Some of them are really anal about ensuring you can’t record or whatever, and they don’t work if the HDMI security handshake isn’t exactly right. I’ve had issues with bad cables, and my portable projector (Anker) has to side-load an alternate version of Netflix because they couldn’t/wouldn’t get the device to pass Netflix “certification”.
I’m guessing this means new partnerships and money changing hands, or nobody on a Roku can watch Netflix anymore, or they put these ads at a higher level that bypasses whatever security/DRM Netflix uses. Probably the last one, but if Netflix thinks they will lose money to this they’ll probably just pull their certification anyway.
I’ll take a compromise where “3.1” is etched in each head end, and I can trust that “3.1” means something, and start with that.
The real crux of the issue is that there is no way to identify the ability of a port or cable without trying it, and even if labeled there is/was too much freedom to simply deviate and escape spec.
I grabbed a cable from my box to use with my docking station. Short length, hefty girth, firm head ends; it certainly felt like a full-featured video/data/dock cable… it did not work. It did work with my numpad/USB-A hub though, so it had some data capability (I did not test whether it was 2.0 or 3.0). The cable that DID work with my docking station was actually a much thinner, weaker-feeling one that came with a portable monitor I also had. So you can’t even judge by wiring density.
And now we have companies using the port to deviate from spec completely, like the Raspberry Pi 5 technically using USB-C, but at a power level unsupported by spec. Or my video glasses that use USB-C connections all over, with a proprietary design that ensures only their products work together.
Universal appearance, non-universal function, universal confusion.
I hate it. At least with HDMI, RCA, 3.5mm, Micro-USB…I could readily identify what a port and plug was good for, and 99/100 the unknown origin random wires I had in a box worked just fine.
Actually, that leads me to another point:
Once upon a time, the concept behind a universal USB-C connector was that we could do exactly that.
Laptop? Phone? Camera? America? Germany? Japan? Power? Connect to the TV? Internet?
Wouldn’t matter anymore. USB-C to cover it all. Voltage high for the laptop, low for the camera, all available just the same in every country, universal. So yes, fill the airports and hotels with them. Use them for power and to play videos on the TV. Because we weren’t supposed to have to question the voltage or abilities of the ports and cables in use.
Did/will that future materialize?
I feel the only legitimate place for a €1 cable is those USB-A to C cables you get bundled with things for 5V charging. That’s it. And the limits on those are obvious from the A plug end.
Anything that wants to be USB-C on both ends should be fully compatible with a marked spec, not indistinguishable from a 5V junk wire or freely cherry picking what they feel like paying for.
Simply marking on the cable itself what generation it supports would be a nice start, but the real issue is the cherry-picking. The generation numbers don’t cover a wire that does maximum everything except video, or a proprietary setup that does power transfer in excess of spec (Dell, Raspberry Pi 5). But they all have the same ends and the same lack of demarcation, leading to the confusion.
The worst part is, I could accept that as a generational flaw. The newer ones get better, the old ones lying around do less. OK, that’s the beast of progress.
But no. They still make cables today that do power only. They still make cables that do everything except video. Why? Save a couple of cents, make dollars off multiple product lines, etc. Money.
What could have been the cable to end all cables…just continued to make USB a laughing stock of confusion.
Don’t even get me started on the device side implementations…
Depending on the channel, they weren’t wrong. And ironically, such channels are probably their favorites too.
Do as I say, not as I do and all that.
It’s not really because it fell over; it’s because it wasn’t supposed to fall over. Consumable launch hardware doesn’t contend with this, because failing to return is still a success. This is a failure. It must be learned from and fought against/prevented going forward.