I mean, from a privacy perspective alone it's clear that Meta throws its ethics out the door. There's the Cambridge Analytica scandal, the more recent incident with Instagram bypassing Android OS restrictions to enable more tracking, and many, many other examples.
Their apps also regularly nag you to allow access to stuff like contacts and the photo gallery when you've already said no the first time.
And for a personal anecdote: I was recently helping a senior set up WhatsApp Desktop on her Windows computer. It could chat fine but refused to join calls, displaying an error saying no microphone was connected. Except there was a mic connected, and it could record voice notes fine. It turns out the error actually meant that no webcam was connected, and a webcam is required to join calls. I think it's the same in the mobile app, where you need to grant the camera permission to join a video call even if you turn your video off. Meanwhile Zoom, Teams, Webex, and others let you join any call without a mic or camera.
As she didn't have a webcam, I first tried the OBS virtual camera, but WhatsApp refused to recognize it even though every other app worked fine with it. Somehow DroidCam with no phone connected worked, showing a black screen in the virtual camera feed, and that got WhatsApp to join the call successfully. Absolutely ridiculous, and it makes it clear to me how desperately they want that camera access and that sweet data.
See, this is why I made a comment in that Apple thread (see my post history) about stopping Facebook from doing things like this. I was told "Android can do it too". Yes, but no. Apple may do evil things, but they punished Facebook for their bullshit by revoking their certificate. The sheer amount of contact info at stake (phone numbers, email addresses, social media accounts; "people just submitted it, they trust me, dumb f-") means you can't have bad-faith actors like Zuckerberg Zucking about. WhatsApp is such a clear antitrust case, just for starters.
Edit: sorry, that wasn't entirely clear. I mean we need Apple's system of granularity: "deny access to contacts" needs to work even when the company asking (Facebook) tries to trick people.
Personally, I wonder if even as the LLM hype dies down we'll get a new boom in AI for robotics and the "digital twin" technology Nvidia has been hyping up to train those robots. That's going to need GPUs for both the ML component and the 3D visualization. Robots haven't yet had their SD 1.1 or GPT-3 moment; in LLM terms, we're still in the early days of Pythia, GPT-J, AI Dungeon, and the like.
That's going to tank the stock price, though, as that's a much smaller market than AI, but it's not going to kill the company. Hence why I'm talking about something like robotics, which has a lot of room to grow and can make use of all those chips and datacenters they're building.
Now, there is one thing in AR/VR that might need this kind of infrastructure, and that's AI-driven games or Holodeck-like experiences: have the frames be generated rather than modeled and rendered traditionally.
Nvidia's not your average bear, they can walk and chew bubblegum at the same time. CUDA was developed off money made from GeForce products, and now RTX products are being subsidized by the money made on CUDA compute. If an enormous demand for efficient raster compute arises, Nvidia doesn't have to pivot much further than increasing their GPU supply.
Robotics is a bit of a "flying car" application that gets people to think outside the box. Right now, both Russia and Ukraine are using Nvidia hardware in drones and cruise missiles and C2 as well. The United States will join them if a peer conflict breaks out, and if push comes to shove then Europe will too. This is the kind of volatility that crazy people love to go long on.
I feel that the push will not be towards a general computing device though, but rather to a curated computing device sort of like the iPhone or iPad. Basically general in theory but actually vendor restricted inside a walled garden.
With improved cellular and possibly future satellite connectivity I feel that this would also be more of a thin client than a local first device, since companies want that recurring cloud subscription revenue over a single lump sum.
Keep in mind that bitrot is a real thing if you roll your own storage. While most cloud storage solutions store multiple copies of your data, I'm not sure all of them have a system that checks for and repairs bitrot.
I love my ZFS server as it handles all that transparently but that's really not an option for everyone.
I think I have some bitrot in my photo collection; a few pictures seem to be broken, but it's far less than 1% of them. I'm fine with that, and I could probably restore most of those images if I tried.
After I got my server going, I transferred all my photos over and ran a utility overnight to check them for corruption; the name escapes me, but it was an open-source CLI program. A small number of images turned out to be corrupted; thankfully most could be replaced with pristine backup copies, and the rest were restored with only minor visual glitches.
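For anyone rolling their own storage without ZFS, the idea can be sketched in a few lines of Python with Pillow (to be clear, this is not the utility I mentioned, just a hypothetical illustration; it only catches files that fail to decode, and silent bit flips that still decode cleanly need checksums against a known-good copy):

    # Hypothetical sketch, not the utility above: walk a photo folder with Pillow
    # and flag files that no longer decode cleanly.
    import sys
    from pathlib import Path
    from PIL import Image

    EXTS = {".jpg", ".jpeg", ".png", ".gif", ".tif", ".tiff"}

    def find_corrupt(root):
        bad = []
        for path in Path(root).rglob("*"):
            if path.suffix.lower() not in EXTS:
                continue
            try:
                with Image.open(path) as img:
                    img.load()              # force a full decode, not just the header
            except Exception as exc:        # truncated/undecodable files land here
                bad.append((path, exc))
        return bad

    if __name__ == "__main__":
        for path, exc in find_corrupt(sys.argv[1]):
            print(f"CORRUPT: {path} ({exc})")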
That's fine for the main article, but I think there should be a way to get higher-quality images should the reader request them. If power is a concern, those can be hosted elsewhere.
I think it's acceptable for the drawings to be compressed this way but the photographs are very unclear.
The issue is that it's hacky, and in that case I'd rather go with an Intel or AMD x86 system with more or less out-of-the-box Linux support. What we're looking for is a performant ARM system where Linux is a first-class citizen.
It might be fun to have so many machines, but in reality it's simpler and cheaper to virtualize everything on two or three powerful hosts. Considering that you're already using a soundproof rack, you might as well go with used rack servers with lots of memory and compute. Those also come with goodies like BMCs, dual PSUs, and ECC.
Personally, I have two Xeon rack servers running in a Proxmox cluster with an SBC as the QDevice. It has more than enough memory and compute for my needs, and it also serves as my virtualized router and NAS. The whole setup only takes up 4U of space (servers + switch + modem/QDevice), with a single UPS on the floor, and idle power is around 150W.
We may be optimizing for different needs. For instance, while I was able to get a significant amount of extra height, I didn't have a lot of cabinet depth to work with, which is somewhat limiting for traditional server hardware. There are short-depth options out there, but I also wanted at least some GPU capability. The integrated GPUs in the SER9s are not top of the line by any means, but they're more than capable for what I want to be working on.
9 degrees. arcsin(arccos(arctan(tan(cos(sin(9)))))) basically builds a set of sin-cos-tan layers that arctan-arccos-arcsin then unwrap one by one, which should result in nothing having changed, unless the functions used weren't accurate.
There is no choice here - each inverse is uniquely determined. That's similar to how 3 and -3 are both square roots of 9 (i.e., solutions to x^2 = 9), but sqrt(9) = 3 because it denotes the principal square root, which by convention is always the non-negative value. Of course, in a different context we might design functions to preserve more information, like atan2(y, x) != atan(y/x) in general (atan2 takes the quadrant into account and returns the full range (-pi, pi], while atan only returns principal values in (-pi/2, pi/2)), since practical applications benefit from preserving the quadrant beyond just the principal inverse (or from not failing when x = 0!)
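A quick Python illustration of that last point (standard math module, nothing else assumed):

    import math

    y, x = 1.0, -1.0              # a point in the second quadrant
    print(math.atan(y / x))       # -0.785... rad (-45 deg): principal value only, quadrant lost
    print(math.atan2(y, x))       #  2.356... rad (135 deg): quadrant preserved
    print(math.atan2(1.0, 0.0))   #  1.570... rad (90 deg): no division by zero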
The inverse branches are not unique. You might think there is no choice being made, but picking the standard branch is a choice, b/c I can always shift the result by 2π by picking a different branch of the inverse. The answer is not unique, & the assumption is that the calculators are using the standard branch.
Of course, but the choice is standard and thus the answer is 9. I can define a non-standard sqrt(x) which sometimes gives the positive root and sometimes the negative one, and then sqrt(sqrt(16)) could be -2 or undefined (if I defined sqrt(16) = -4), but that's just silly - the only reasonable interpretation of what the calculator should show for sqrt(sqrt(16)) is simply 2.
You can assume that sin(9) stays within the principal ranges of all the functions that are post-composed with it, so the middle layers cancel and what you end up with in the end is arcsin(sin(9)). Naively you might think that's 9, but you have to be careful, b/c the standard inverse branch of sin is defined to be [-1, 1] → [-π/2, π/2].
Edit: The assumption is that the calculators are using specific branches of the inverse functions, but that's still a choice being made, b/c the functions are periodic, so there are no unique choices of inverse functions. You have to pick a branch within one period.
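To make the branch issue concrete, a quick check with Python's math module (radians):

    import math

    # 9 rad lies outside arcsin's principal range [-pi/2, pi/2],
    # so naively unwrapping does not give 9 back:
    print(math.asin(math.sin(9)))   # ~0.42478, i.e. 3*pi - 9, not 9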
arcsin(arccos(arctan(tan(cos(sin(9)))))) = 9 in degrees mode (calculator style: the regular trig functions output pure numbers, those numbers get interpreted as degrees by the next function, and similarly for the inverses), because each intermediate value lands in the principal-value domain of the next inverse (e.g., arctan(tan(x)) = x when x ∈ (-90°, 90°), and the intermediates happen to be in those ranges). Specifically, sin(9°) ≈ 0.156434, cos(0.156434°) ≈ 0.999996, arctan(tan(0.999996°)) = 0.999996°, arccos(0.999996) ≈ 0.156434°, arcsin(0.156434) ≈ 9°.
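A quick sanity check of that degrees-mode reading, sketched in Python (the wrapper names are made up for the sketch; they just mimic a calculator in degrees mode):

    import math

    # Calculator-style degrees mode: trig functions take degrees,
    # inverse functions return degrees.
    def sind(x):  return math.sin(math.radians(x))
    def cosd(x):  return math.cos(math.radians(x))
    def tand(x):  return math.tan(math.radians(x))
    def asind(x): return math.degrees(math.asin(x))
    def acosd(x): return math.degrees(math.acos(x))
    def atand(x): return math.degrees(math.atan(x))

    print(asind(acosd(atand(tand(cosd(sind(9)))))))   # ~9.0, up to float error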