• 1 Post
  • 1.92K Comments
Joined 3 years ago
Cake day: June 15th, 2023


  • So, what prediction did Bezos make back then that seems particularly poignant right now? Bezos thinks that local PC hardware is antiquated, and that the future will revolve around cloud computing scenarios, where you rent your compute from companies like Amazon Web Services or Microsoft Azure.

    This isn’t a new idea, and it certainly predates Bezos.

    I’m older now, but throughout my life there has been a pendulum swing back and forth between local compute power and remote compute power. The current rise in RAM prices follows the exact same path; this cycle has played out half a dozen times already in the last 50 years. Compute power gets cheap, then it gets expensive, then it gets cheap again. Bezos’s statements are just the most recent example. He’s no prophet. This has happened before, and it will revert again. Rinse, repeat:

    • 1970s remote compute power: Terminals couldn’t really compute anything locally and required dialing into a mainframe over an analog telephone line to access the remote computing power.

    • 1980s local compute power: CPUs got fast and cheap! Now you could do all your processing right on your desk without needing a central computer/mainframe.

    • 1990s remote compute power: Thin clients! These were underpowered desktop units that accessed the compute power of a server, such as Citrix WinFrame/MetaFrame or SunOS (for Sun Ray thin clients). Honorable mention for retail-type units like Microsoft WebTV, which was the same concept with different hardware/software.

    • 2000s local compute power: This was the widespread adoption of desktop PCs with 3D graphics cards as standard, along with high-power CPUs.

    • 2010s remote compute power: VDI appears! This is things like VMware Horizon or Citrix Virtual Desktops, along with the rise of AWS.

    • 2020s local compute power: Powerful CPUs and massively fast GPUs are now standard and affordable.

    • 2030s remote compute power…in the cloud…probably


  • The issue you’ll run into is effectiveness at that small scale, so you’ll be tempted to share data with other systems like that, and eventually you’ll end up creating a different flock.

    I wonder if a segregated system design could address this: in-system segregation, similar to a TPM, where the actual detection/matching part of the system is separated from the command and control part.

    As in, the camera and OCR operations would live in their own embedded system which could never receive code updates from the outside. Perhaps this is etched into the silicon SoC itself. Also on silicon would be a small NVRAM that could only hold requested license plate numbers (or perhaps a hash of them). This NVRAM would be WRITE ONLY, so it could never be queried from outside the SoC. The raw camera feed would be wired to the SoC. The only input would be from an outside command and control system (still local to our SoC) that an administrator could use to send in new license plate numbers to search against. The only output of the SoC would be “Match found against License Plate X”. Even the timestamp would have to be applied by the outside command and control system.

    This would have some natural barriers against dragnet surveillance abuse.

    • It would never be possible to dump the license plates being searched for from the cameras themselves, even by abusive admins. The only admin option would be to overwrite the list of what the camera is trying to match against.
    • The NVRAM that contains the match list could be intentionally sized small, to perhaps a few hundred plate numbers, so that an abusive admin couldn’t simply generate every possible license plate combination, effectively turning this back into a blanket surveillance tool. The NVRAM limit could be implemented as an on-die fuse link so that, upon deployment, the size could be made as small as needed for the use case. A rough code sketch of this interface follows the list.
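
    To make the idea concrete, here is a minimal C sketch of what that write-only interface could look like. Everything here is hypothetical: the names (matchlist_overwrite, matchlist_check, MATCH_LIST_SLOTS) and the storage details are illustrative assumptions, not any real SoC’s API.

    ```c
    #include <stdint.h>
    #include <stddef.h>
    #include <string.h>

    #define MATCH_LIST_SLOTS 256u   /* deliberately small; fuse-limited in hardware */

    typedef uint64_t plate_hash_t;  /* store a hash of the plate, never the raw text */

    /* Backed by on-die NVRAM in the real design; a plain array in this sketch. */
    static plate_hash_t match_list[MATCH_LIST_SLOTS];

    /* The only externally reachable operation: overwrite the entire list.
     * There is deliberately no read/query counterpart. */
    void matchlist_overwrite(const plate_hash_t *plates, size_t count)
    {
        memset(match_list, 0, sizeof(match_list));
        if (count > MATCH_LIST_SLOTS)
            count = MATCH_LIST_SLOTS;          /* capacity itself is the policy */
        memcpy(match_list, plates, count * sizeof(plate_hash_t));
    }

    /* Internal to the SoC: called by the OCR pipeline for each plate it reads.
     * The only thing that ever leaves the die is this yes/no result. */
    int matchlist_check(plate_hash_t observed)
    {
        for (size_t i = 0; i < MATCH_LIST_SLOTS; i++) {
            if (match_list[i] != 0 && match_list[i] == observed)
                return 1;                      /* raise the "match found" signal */
        }
        return 0;
    }
    ```

    The design point is that the only externally callable operation overwrites the whole list, and the only signal that ever leaves the die is the yes/no match, so neither the stored plate hashes nor the raw video can be pulled back out through the command and control channel.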


  • So like… I feel scared about the idea of like… just going for a walk all by myself…

    How about making a list of the things you think could possibly happen to you going for a walk by yourself that would justify being rationally scared. Then go through the list and consider: even if each event is possible, how probable is it? I think you’ll find that the things you’re most afraid of are the least likely to happen.

    Now, as a comparison, make a list of all the things that could happen to you staying at home, and another list of all the things that could happen to you being driven to your destination. Assign realistic probabilities to each event. I’m guessing you’ll find that the probabilities of bad things on each of these three lists will all look pretty equal. If they are equal, then going for a walk is no more dangerous than staying home or being driven somewhere.

    In a sense, if you’re afraid to go for a walk, you should be equally or more afraid of going for a drive or staying at home. As such, it’s not more dangerous to go for a walk than the other options.





  • I feel like the problem here is that you get people who are curious or like the other features the fridge has and just get what they can when theirs goes out. And while, sure, those people learn not to do that again,

    Part of what makes us intelligent is learning from others. I guess I would expect buyers to do even the most basic research on a large-dollar-figure purchase, which would likely expose them to the headlines about Samsung putting ads on fridges after the sale.

    Do people actually just walk into an appliance store and drop more than $1k on what they see on the floor without researching reliability, warranty, or other features from articles and news sources?





  • As I said, “if”, and I too think it’s unlikely, but it is possible. They may have a data aggregator where the customer inputs their assets and liabilities, or it may only be able to compare against the bank’s own products. So if you have a single bank that you use for your credit cards, car loan, and personal checking account, it would have all of those exposed for calculations.








  • I think we’ll have to agree to disagree. Oftentimes, if I see an interesting question in the comments, I am glad for it, because that was the insight I needed to want to read the article and answer it.

    Just reading comments without the article? I have no issue with that at all, and do that myself.

    For me that isn’t annoying unless the commenter is getting something wrong that is talked about in the article, and doubles down on it.

    How do you, as the commenter yourself, know you aren’t getting something wrong without reading the article?

    I feel like each post is an invitation to discuss the general topic

    How do you know what the general topic is without reading the article?

    If you feel like that is disrespectful, I get where you’re coming from, but I don’t think it is that disrespectful.

    Maybe disrespectful is too strong a term. Let me amend that: I lose respect for the poster when they’re asking a question that is answered in the article. I sometimes write off engaging with them further in that thread because they’re clearly not doing even the most basic of tasks to be a part of the conversation.

    But plenty of interesting conversations can happen in the comments (like this one) that have almost nothing whatever to do with the article!

    I’ll do this too on occasion, if I can clearly tell we’re not discussing the article topic, but it’s a gamble on my part, and if someone smacks me down because it is article-topical, I fully own that and apologize, knowing it’s my fault.