They’ve done some amazing work.
made you look
It’s “FEX”; Valve have apparently been testing it with Proton.
The Asahi Linux team have their own packaging/tooling around it, but theirs is slower at runtime because they have to run the games inside a VM as well.
They’re investing in “green metal”, using their own renewable generation to produce hydrogen.
Whether or not it works out is another matter, but he (Andrew Forrest) seems to believe in it and is willing to put his money where his mouth is.
It supports it, but it’s opt-in by apps.
Enabling compression is another option (Though with a speed and size penalty); it’s user visible at least.
Hmm, for me it just says “This item is not available for purchase in your region”, not sure I know that currency.
Then don’t get me started about how the www subdomain itself no longer makes sense. I get that the system was designed long before HTTP and the WWW took over the internet as basically the default, but if we had known that in advance it would’ve made sense to not try to push www in front of all website domains throughout the ’90s and early 2000s.
I have never understood why you can delegate a subdomain but not the root domain. I doubt it was a technical issue, because they added support for it recently via SVCB records (But maybe technical concerns were actually fixed in the decades since).
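For reference, SVCB/HTTPS records have an “AliasMode” (SvcPriority 0) that behaves like a CNAME but is allowed at the zone apex. A hypothetical zone entry (names made up) would look something like:

    example.com.  3600  IN  HTTPS  0  cdn-target.example.net.
    ; SvcPriority 0 = AliasMode: roughly "a CNAME, but legal at the apex"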
Chromium had it behind a flag for a while, but if there were security or serious enough performance concerns then it would make sense to remove it and wait for the jpeg-xl encoder/decoder situation to change.
Adobe announced they were supporting it (in Camera Raw); that’s when the Chrome team announced they were removing it (due to a “lack of industry interest”).
They’re “file like” in the sense that they’re exposed as an fd, but they’re not exposed via the filesystem at all (Unlike e.g. unix sockets), and the existing API is just mapped over the sockets one (i.e. write() instead of send(), read() instead of recv()). There’s also a difference in how you create them: you open() a file, but connect() a socket, etc.
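A rough C sketch of that difference (hypothetical file path and a documentation IP address, error handling omitted):

    #include <fcntl.h>
    #include <unistd.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>

    int main(void) {
        char buf[256];

        /* A file: opened by path, then the generic read()/write() on the fd */
        int file_fd = open("/tmp/example.txt", O_RDONLY);   /* hypothetical path */
        read(file_fd, buf, sizeof buf);
        close(file_fd);

        /* A TCP socket: no path anywhere; created with socket()/connect().
           read()/write() happen to work on the fd, but the dedicated calls
           are send()/recv(), which also take flags that files don't have. */
        int sock_fd = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = {0};
        addr.sin_family = AF_INET;
        addr.sin_port = htons(80);
        inet_pton(AF_INET, "192.0.2.1", &addr.sin_addr);    /* example address */
        connect(sock_fd, (struct sockaddr *)&addr, sizeof addr);
        send(sock_fd, "ping", 4, 0);
        recv(sock_fd, buf, sizeof buf, 0);
        close(sock_fd);
        return 0;
    }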
(As an aside, it turns out Bash has its own virtual file-based wrapper around sockets via its /dev/tcp/host/port pseudo-paths, so you can do things like cat a remote port with Bash, something you can do natively in Plan 9)
Really it just shows that “everything is a file” didn’t stand up in practice; there’s more stuff that needs special treatment than doesn’t (e.g. interacting with TTYs also has special APIs). It makes more sense to have a better dedicated API than a generic catch-all one.
RFC 3339 is a simplified profile of ISO 8601 that only covers YYYY-MM-DD style formatting; if you only ever use that format and avoid things like “2024-W36”, they’re mostly interchangeable.
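A couple of illustrative timestamps (dates chosen arbitrarily):

    2024-09-06T12:34:56Z   valid under both RFC 3339 and ISO 8601
    2024-W36-5             ISO 8601 week date, not valid RFC 3339
    20240906T123456Z       ISO 8601 "basic" format, not valid RFC 3339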
Plan 9 even extended the “everything is a file” philosophy to networking, unlike everybody else that used sockets instead.
Existing JPEG files (which are the vast, vast majority of images currently on the web and in people’s own libraries/catalogs) can be losslessly compressed even further with zero loss of quality. This alone means that there’s benefits to adoption, if nothing else for archival and serving old stuff.
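As a rough sketch of what that looks like with the libjxl reference tools (file names are just placeholders):

    cjxl photo.jpg photo.jxl   # lossless JPEG transcoding is the default for JPEG input
    djxl photo.jxl photo.jpg   # reconstructs the original JPEG file exactly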
Funny thing is, there was talk on the Chrome bug tracker of using just this ability transparently at the HTTP layer (like gzip/brotli compression), but they’re so set on pushing their AVIF format that they backed away from it.
For a game I don’t think it’s the end of the world, but you could end up in a situation where the first check passed, then you go to use the file and that fails, so you end up having to handle the “can’t use file” case twice anyway. But for something like showing a “Continue” menu item you obviously need to check that there’s an existing save to begin with before loading it.
In general, checking first leads to race conditions known as “time-of-check to time-of-use” (TOCTOU); the pitfalls can vary greatly, but realistically they aren’t a problem in a lot of cases.
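A rough C sketch of the two patterns (hypothetical save-file path, minimal error handling):

    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        const char *path = "save.dat";   /* hypothetical save file */

        /* Check-then-use: racy, and the open can still fail anyway */
        if (access(path, R_OK) == 0) {
            FILE *f = fopen(path, "rb");
            if (!f) {
                /* Existed a moment ago, gone now (TOCTOU): the failure
                   case still has to be handled here */
            } else {
                fclose(f);
            }
        }

        /* Just try it, and handle the single failure path */
        FILE *f = fopen(path, "rb");
        if (!f) {
            /* No usable save: e.g. grey out the "Continue" menu item */
            return 0;
        }
        fclose(f);
        return 0;
    }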
You can’t do normal BitTorrent in browsers; there’s no support for the plain sockets you’d need to communicate with other peers. WebTorrent is technically a new protocol that implements the BT semantics over the stuff browsers do provide, mainly WebRTC data channels (So you can proxy between the different swarms; that’s the “hybrid” nodes in the image on the WebTorrent page)
But it turns out it’s all a moot point, since PeerTube removed WebTorrent support anyway in favour of their own P2P system
Edit: Ok so I misunderstood, and it seems like it’s a bit complicated. The server can (it’s disabled by default) use WebTorrent to import videos; the client still uses the WT trackers to find peers, but uses a different protocol to actually share the video data.
There’s this tool that provides the ability to automatically seed videos, but development has stalled because no up-to-date client will ever make use of it.
I think the one remaining use is the “download as torrent” option, but even then that’s just using a web seed, so it’s just an alternative way to download the video.
Unfortunately WebTorrent isn’t compatible with normal BitTorrent, so unless you’re using a client that specifically supports it, you’re not helping out any PeerTube clients
His personal LLC is called “Excession”; considering some of the plot points in that book, I doubt he enjoyed it at all. It’s just “nerd set dressing”.
At the time it was just an ad-lib by Jason Isaacs; guessing he wished on a monkey’s paw for it to make sense in context.
What’s the problem with that, though? Systems like that are pretty much guaranteed to be isolated from the internet.
Because things break down eventually, and when it comes time to buy replacement parts you discover that they’re effectively impossible to find. Then instead of having a nice, planned transition period you’ve got like a weekend to cobble together something to get it working again.
Yep, our center-left government recently announced plans to keep using natural gas for at least another 25 years
But it’s ok, because we’ll work out carbon capture in the future! Which is the exact same notion that our previous right wing government based their policy on.
It’s a tad out of date, but the Second Doctor claims he received a medical degree after studying under Joseph Lister in 1888.