• 0 Posts
  • 43 Comments
Joined 8 months ago
Cake day: February 10th, 2024

  • My best guess: whatever they’re filing now was so exhaustively researched that it took months to prepare the strongest case they’re able to make, possibly delayed by the lawyers working on several other cases. Plus waiting until sales have dried up can maximize damages.

    Another possibility is that Nintendo/TPC is planning to make some big Pokémon announcements soon and wants to target this shortly before their own new games to reduce competition. Palworld might seem like more of a threat to the execs now that Pokémon is nearing a major release than it was in the middle of a long drought for the series.


  • USB-C video is usually DisplayPort Alt Mode, which uses a completely different data rate and protocol from USB.

    Even using old 2016 hardware, a computer and USB-C cable that both only support 5 Gbps USB (such as USB 3.1 Gen 1) can easily transmit an uncompressed 4K 60Hz video stream over that cable, using about 15.7 Gbps of DisplayPort 1.2 bandwidth. It could go far higher than that with DP 2.0.
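
    A quick back-of-the-envelope sketch of those numbers (my own assumed timing figures, not anything quoted from the DisplayPort spec; the exact value depends on the blanking timing used):

    ```python
    # Rough estimate of uncompressed 4K60 bandwidth at 8 bits per color channel.
    width, height, refresh_hz, bits_per_pixel = 3840, 2160, 60, 24

    active_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    # With common CTA-861 timing the full raster is 4400x2250 including blanking,
    # and DP 1.2 uses 8b/10b line coding, so the on-wire rate is higher still.
    raster_gbps = 4400 * 2250 * refresh_hz * bits_per_pixel / 1e9
    wire_gbps = raster_gbps * 10 / 8

    print(f"Active pixels only: {active_gbps:.1f} Gbps")  # ~11.9 Gbps
    print(f"With blanking:      {raster_gbps:.1f} Gbps")  # ~14.3 Gbps
    print(f"8b/10b on the wire: {wire_gbps:.1f} Gbps")    # ~17.8 Gbps
    print("USB 3.0 link rate:    5.0 Gbps")               # for comparison
    ```

    However you slice the timing, it's several times more than a 5 Gbps USB data link could carry, which is why the Alt Mode lanes matter.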

    Some less common video-over-USB devices/docks use DisplayLink instead, which is indeed contained within USB packets and bound by the USB data rate, but it uses lossy compression so those uncompressed numbers aren’t directly comparable.


  • zarenki@lemmy.ml to Technology@lemmy.world · Some basic info about USB · 19 days ago

    For that portable monitor, you should just need a cable with USB-C plugs on both ends which supports USB 3.0+ (could be branded as SuperSpeed, 5Gbps, etc). Nothing more complicated than that.

    The baseline for a cable with USB-C on both ends should be PD up to 60W (3A) and data transfers at USB 2.0 (480Mbps) speeds.

    Most cables stick with that baseline because it’s enough to charge phones and most people won’t use USB-C cables for anything else. Omitting the extra capabilities lets cables be not only cheaper but also longer and thinner.

    DisplayPort support uses the same extra data pins that are needed for USB 3.0 data transfers, so in terms of cable support they should be equivalent. There also exist higher-power cables rated for 100W or 240W but there’s no way a portable monitor would need that.
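
    To sum up the common cable tiers as I understand them (typical values based on the above, not an exhaustive or official list):

    ```python
    # Rough summary of USB-C to USB-C cable capabilities (assumed typical values).
    # Power rating and data wiring are mostly independent: a 100 W or 240 W cable
    # can still be wired for USB 2.0 data only.
    CABLE_TIERS = {
        "baseline (most cables)": {"data": "USB 2.0 (480 Mbps)", "dp_alt_mode": False},
        "full-featured":          {"data": "USB 3.x (5-20 Gbps)", "dp_alt_mode": True},
    }
    POWER_RATINGS = ["60 W (3 A) baseline", "100 W (5 A)", "240 W (EPR)"]

    for name, caps in CABLE_TIERS.items():
        print(f"{name:24} data: {caps['data']:20} DP Alt Mode: {caps['dp_alt_mode']}")
    print("Power ratings:", ", ".join(POWER_RATINGS))
    ```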



  • "The whole point of copyright in the first place is to encourage creative expression, so we can have human culture and shit."

    I feel like that purpose has already been undermined by various changes to copyright law since its inception, such as the DMCA and the lengthening of the copyright term from 14 years to 95. Freedom to remix existing works is an important part of creative expression, and current law stifles it for any work released within a person's lifetime. (Even Disney knew this: the animated Pinocchio movie wouldn't exist if copyright could have lasted more than 56 years back then.)

    Either way, giving bots the 'right' to remix things that were made less than a year ago while depriving humans of the right to release anything too similar to a 94-year-old work seems ridiculous on both ends.


  • "a variable-length integer encoding that somewhat resembles what they do in UTF-8. It means for strings < 128 chars, the length is a single byte. Longer than that and more bytes get used as necessary."

    What you used might be similar to unsigned LEB128, which is used in DWARF, WebAssembly, Android's DEX format, and protobuf. It essentially encodes 7 bits of the number in each byte, with the high bit set to 1 in every byte except the last one of the number.

    Though unlike UTF-8, the number's length isn't encoded in the first byte but instead implied by the final byte, arguably making the number's encoding more similar to a terminated string.
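
    A minimal sketch of the scheme in Python (just the general unsigned LEB128 idea, not code from any of those formats):

    ```python
    def uleb128_encode(n: int) -> bytes:
        """Encode a non-negative integer as unsigned LEB128."""
        out = bytearray()
        while True:
            byte = n & 0x7F              # take the low 7 bits
            n >>= 7
            if n:
                out.append(byte | 0x80)  # more bytes follow: set the high bit
            else:
                out.append(byte)         # final byte: high bit stays clear
                return bytes(out)

    def uleb128_decode(data: bytes) -> int:
        """Decode an unsigned LEB128 value from the start of a byte string."""
        result = 0
        for shift, byte in enumerate(data):
            result |= (byte & 0x7F) << (7 * shift)
            if not byte & 0x80:          # clear high bit marks the final byte
                return result
        raise ValueError("truncated LEB128 sequence")

    assert uleb128_encode(624485) == b"\xe5\x8e\x26"
    assert uleb128_decode(b"\xe5\x8e\x26") == 624485
    ```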


  • zarenki@lemmy.ml to 196@lemmy.blahaj.zone · Rulekemon · 1 month ago

    "The baby god event was never officially released, so this actually didn't canonically happen."

    It was released. The Azure Flute and the Hall of Origin event where you meet and battle Arceus in DPPt were indeed never released, but this is different.

    Arceus had various distributions in 2009-2010; the US one was at Toys R Us, for example. Trading that legit Arceus to HGSS and then bringing it to the Ruins of Alph triggers this event, which takes you to a special location where you can choose an egg of Dialga, Palkia, or Giratina.



  • The conditions that processors run under in situations like military equipment are drastically different from those of consumer devices. Consistency and stability are more important than performance in those contexts, so much so that real-time operating systems like VxWorks are popular in that space. They'd probably already have features like clock boost disabled (or use processors that lack it entirely) in favor of a lower fixed clock speed, likely avoiding these issues entirely.


  • Legitimately playing 4K Blu-ray video on a PC without cracking the DRM has an insane list of requirements:

    • Windows 10 (not 11)
    • An Intel processor from gen 7 through 10 (nothing newer, because Intel ditched SGX in 2021)
    • Intel integrated graphics (no Nvidia/AMD)
    • A monitor that supports HDCP 2.2 for DRM (some 4K ones don't)
    • An approved optical drive
    • Proprietary playback software that costs about $100 USD, separate from the cost of the hardware and Windows
    • Miscellaneous other requirements for motherboard features, BIOS settings, etc.

    Meanwhile MakeMKV can rip them on basically any Windows/Linux/Mac system with a compatible BDXL drive.


  • This board has the StarFive JH7110 SoC. That processor has previously been in very low power single board computers like StarFive VisionFive 2 (2022) and Milk-V Mars (2023), a Raspberry Pi clone that can be bought for as low as $40. Its storage limitations (SD/eMMC rather than NVMe) show how much this isn’t meant for laptop use.

    It's also very underpowered for a laptop, even considering that it's intended for developers and doesn't need to be remotely performance-competitive. This has just 4 RV64GC cores, while the cheapest Intel board Framework offers has 12 cores (4P+8E), and any modern RISC-V core is far simpler, with less die area, than even an Intel E-core. These cores also lack the RISC-V vector extension.
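
    If you're curious what a given board reports, here's a quick sketch (assuming a RISC-V Linux system, which typically exposes an "isa" line in /proc/cpuinfo such as "rv64imafdc"):

    ```python
    # Read the ISA string from /proc/cpuinfo and check for the vector extension.
    def riscv_isa_string(path: str = "/proc/cpuinfo") -> str:
        with open(path) as f:
            for line in f:
                key, _, value = line.partition(":")
                if key.strip().lower() == "isa":
                    return value.strip()   # e.g. "rv64imafdc"
        return ""

    def has_vector_extension(isa: str) -> bool:
        # Drop the rv64/rv32 prefix; a 'v' among the single-letter extensions
        # (before any '_z...' multi-letter extensions) means RVV is reported.
        body = isa.lower().removeprefix("rv64").removeprefix("rv32")
        return "v" in body.split("_", 1)[0]

    isa = riscv_isa_string()
    print("ISA:", isa or "not a RISC-V system")
    if isa:
        print("Vector extension:", "yes" if has_vector_extension(isa) else "no")
    ```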


  • zarenki@lemmy.ml to 196@lemmy.blahaj.zone · 📄 rule · 4 months ago

    "and it's free"

    This is very uncommon in the US. Most major banks (I'm not aware of any exceptions) charge a fee for each outgoing wire transfer, usually $25-$30. Bank of America, Wells Fargo, Chase, and PNC are just a few examples I'm aware of, plus every credit union that has local branches in my area. Some of those banks even add a second fee on the recipient's side for incoming wire transfers.

    They often encourage customers to rely on third-party services like Zelle instead for small transfers to friends and family. Many banks' sites/apps can also handle transfers between two accounts at the same bank for free.


  • There actually was a silent and very subtle revision made to the dpad less than a year after the Switch launched (iirc around the time of the Xenoblade 2 controller). The protrusion in the middle of the underside was made slightly longer, which makes it harder to push down all four directions at the same time. The chance of undesired diagonals is very high either way, but slightly lower on the newer one.

    Even the improved version is one of Nintendo’s worst dpads ever (including handhelds) imo, with only the GameCube’s coming remotely close. A huge step back from Wii U Pro’s dpad.



  • A ground-up overhaul of the copyright system would make things so much worse, not better, considering the current climate of power. In the US for example, MPA, RIAA, Entertainment Software Association, Association of American Publishers, and others wouldn’t want public libraries or the used market to exist at all; they would push for making every single transfer of “ownership” on any media involve a payment to the rights holder. Lawmakers are far more likely to accommodate those groups’ desires than the public good.

    The worst parts of the current copyright system are the most recent. Both the DMCA and the extension of US copyright term to 95 years took effect in 1998, and the early 2000s saw many other countries passing laws to make their copyright system closer to US’s in various ways such as the WIPO Copyright Treaty which took effect in 2002 and EU’s 2006 Copyright Directive. Just about the only positive news we’ve seen in US copyright law since then is in temporary exemptions to DMCA’s anti-circumvention rules (Section 1201) which change every year. Copyright law was far less hostile to consumers and the public before the 90s than it is now, and up until 1976 it used to be expected that most media someone consumes would enter public domain within their lifetime.

    The digital era makes market relevance far more ephemeral than ever, and yet the laws written for the digital era moved copyright in the opposite direction. Movie studios simultaneously judge whether a film succeeded almost exclusively on its first week of ticket sales and claim that keeping works out of the public domain for 95 years is necessary. Nothing should be able to justify more than 20 years of copyright. Media formats don't even last as long as copyright; CDs and DVDs rot, game cartridges die, servers shut down, and even books printed on today's low-quality paper will fall apart.

    "Some of it is absurd to me, like the way something can be online but geographically restricted."

    This is a consequence of contract terms more so than copyright. One issue in copyright law that this does connect to, though, is that whether the rightsholder keeps a work reasonably available on the market has no impact on whether the work retains copyright protections. If copyright law did hypothetically include that limitation, providers would become far more likely to make sure that all content is available in all countries, though even then things could still vary in terms of which content is on which platform.


  • For years I've been using KeepassXC on desktop and Keepass2Android on mobile. Rather than sync the kdbx file between my devices, I have each device access it over the network via SFTP, SMB, or NFS. Regardless of the protocol, I need to connect to my home VPN to reach it when away from home, since I don't directly expose those services to the outside world.

    I used to also keep a second copy of the website-tied passwords in Firefox Sync, but recently tried migrating that to Proton Pass because I thought the PIN feature might help, then ultimately decided to move away from that too and start using the KeepassXC-Browser plugin instead. I considered Bitwarden as well but haven't tried it yet; I was somewhat deterred by people saying its UI seems very outdated.


  • Yes.

    My home server has dropbear-initramfs installed so that after reboot I can access the LUKS decryption prompt over SSH. The one LUKS partition contains a btrfs filesystem with both rootfs and home as subvolumes. For all the other drives attached to that system, I use ZFS native encryption with a dataset that decrypts with a keyfile from that rootfs and I have backups of an encrypted copy of that keyfile.
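
    For the ZFS side, the setup is roughly this (a sketch with hypothetical pool, dataset, and keyfile names; the properties are the standard native-encryption ones):

    ```python
    # Create a ZFS dataset whose native encryption unlocks from a 32-byte raw
    # keyfile stored on the LUKS-backed rootfs. "tank/data" and the key path
    # are made-up names for illustration.
    import secrets
    import subprocess
    from pathlib import Path

    keyfile = Path("/root/keys/tank-data.key")    # lives on the encrypted rootfs
    keyfile.write_bytes(secrets.token_bytes(32))  # raw keyformat expects 32 bytes
    keyfile.chmod(0o400)

    subprocess.run([
        "zfs", "create",
        "-o", "encryption=aes-256-gcm",
        "-o", "keyformat=raw",
        "-o", f"keylocation=file://{keyfile}",
        "tank/data",
    ], check=True)
    ```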

    I don’t think there’s a substantial performance impact but I’ve never bothered benchmarking.



  • "I'm not sure if this is required. Any decent e-mail server uses TLS to communicate these days, so everything in transit is already encrypted."

    In transit, yes, but not end-to-end.

    One feature that Proton advertises: when you send an email from one Proton mail account to another Proton address, the message is automatically encrypted such that (assuming you trust their client-side code for webmail/bridge) Proton's servers never have access to the message contents for even a moment. When incoming mail hits Proton's SMTP server, Proton technically could (but claims not to) log the unencrypted message contents before encrypting it with the recipient's public key and storing it. That undermines Proton's promise that it has no access to your emails. If both parties in an email conversation agree to use PGP encryption they can avoid that risk, and no mail server on either end would have access to anything more than metadata and the initial exchange of public keys, but most humans won't bother doing that key exchange and almost no automated mailers would.

    Some standard way of automatically asking a mail server “Does user@proton.me have a PGP public key?” would help on this front as long as the server doesn’t reject senders who ignore this feature and send SMTP/TLS as normal without PGP. This still requires trusting that the server doesn’t give an incorrect public key but any suspicious behavior on this front would be very noticeable in a way that server-side logging would not be. Users who deem that unacceptable can still use a separate set of PGP keys.
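
    For illustration, the sender side of that end-to-end case is just "encrypt to the recipient's public key before anything touches SMTP." A minimal sketch using the third-party pgpy library and a hypothetical exported key file (my own example, nothing Proton-specific):

    ```python
    # Sender-side PGP encryption sketch: every mail server in the path only
    # ever sees the armored ciphertext plus routing metadata.
    import pgpy

    # The recipient's public key, obtained out of band or via whatever
    # key-discovery mechanism ends up being standardized.
    recipient_key, _ = pgpy.PGPKey.from_file("recipient_public.asc")

    message = pgpy.PGPMessage.new("The body that the servers never get to read.")
    ciphertext = recipient_key.encrypt(message)

    print(str(ciphertext))  # -----BEGIN PGP MESSAGE----- ...
    ```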


  • They say the reason for needing their bridge is the encryption at rest, but I feel like the better way to push email privacy forward would be to publish (or, better yet, coordinate with other groups on drafting) a public standard that both clients and competing email servers could adopt: an email syncing protocol designed for that sort of zero-access encryption, where users give their client a key file. A bridge would be easier to swallow as a fallback option until there's wider client support, rather than as the only way.

    A similar standard for server-to-server communication, like automatic PGP key negotiation, would be nice too.

    Still, Proton has an easy-to-access data export that doesn't require a bridge client or subscription or anything; I think that's required by GDPR. It's manual enough that it's not an effective way to keep up-to-date backups in case you ever abruptly lose access, but it's good enough for migrating to another provider.