FYI BitLocker is on by default in Windows 11. The defaults will also upload the BitLocker key to a Microsoft Account if available.
This is why the FBI can compel Microsoft to provide the keys. It's possible, perhaps even likely, that the suspect didn't even know they had an encrypted laptop. Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user. It protects their data in the event that someone steals the laptop, but still allows them to recover their own data later from the hard drive.
Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
Except the steps to do that are: disable BitLocker, create a local user account (assuming you initially signed in with a Microsoft account, because MS now forces it on you for Home editions of Windows), delete your existing keys from OneDrive, then re-encrypt using your local account and make sure not to sign into your Microsoft account or link it to Windows again.
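The command-line half of that dance looks roughly like this (a sketch only, assuming the system drive is C:; the escrowed keys themselves have to be deleted on the web at https://account.microsoft.com/devices/recoverykey):

    rem elevated prompt; confirm the current encryption state first
    manage-bde -status C:
    rem turn BitLocker off and wait for the (possibly hours-long) decryption
    manage-bde -off C:
    rem ...switch to a local account and delete the cloud keys, then:
    manage-bde -on C: -recoverypassword
    rem it prints a fresh recovery key; store that one yourself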
A much more sensible default would be to give the user a choice right from the beginning, much like Apple does. When you go through Setup Assistant on a Mac, it doesn't assume you are an idiot; it literally asks you up front: "Do you want to store your recovery key in iCloud or not?"
> make sure not to sign into your Microsoft account or link it to Windows again
That's not so easy. Microsoft tries really hard to get you to use a Microsoft account. For example, logging into MS Teams will automatically link your local account with the Microsoft account, thus starting the automatic upload of all kinds of stuff unrelated to MS Teams.
In the past I also had Edge importing Firefox data (including stored passwords) without me agreeing to do so, and then uploading those into the Cloud.
Nowadays you just need to assume that all data on Windows computers is available to Microsoft; even if you temporarily find a way to keep your data out of their hands, an update will certainly change that.
Yes, they push the MS account stuff very hard. I've found Windows so actively hostile to the user that I basically only use Linux now.
I used to be a Windows user; it has really devolved to the point where it's easier for me to use Linux (though I'm technical). I really feel for the people who aren't technical and are forced to endure the crap that Windows pushes on users now.
That’s the real problem MS has. It’s becoming a meme how bad the relationship between the user and windows is. It’s going to cause generational damage to their company just so they can put ads in the start menu.
I switched from Windows to Mac 15 years ago. It was a revelation when the terrible habits of verbally abusing my computer and anxiously saving files every 22 seconds just evaporated.
Those old habits have been creeping back lately through all the various *OS 26 updates. I too now have Linux on a Framework. Not perfect, but so much better for my wellbeing.
I bought and returned an AMD Framework. I knew what I was getting into, but the build quality + firmware quality were lacking, sleep was bad and I'm not new to fixing Linux sleep issues. Take a look at the Linux related support threads on their forum.
I've been using AMD EliteBooks, the firmware has Linux happy paths, the hardware is supported by the kernel and Modern Standby actually works well. Getting one with a QHD to UHD screen is mandatory, though, and I wouldn't buy a brand new model without confirming it has working hardware on linux-hardware.org.
If you look online, HP has a YouTube channel with instructional videos for replacing and repairing every part of their laptops. They are made to make memory, storage and WiFi/5G card replacements easy, parts are cheap and the after market for them is healthy.
I've also had good luck with their support: they literally overnighted a new laptop, with a return box for the broken one, within a day.
We have Elitebooks at work and can confirm that the 8x0 series, at least until G8, has superb Linux support out of the box (and I run Arch, by the way). IME it's actually better than Windows, since both my AMD and Intel models have had things not working on Windows (the AMD still often hangs during sleep).
> Getting one with a QHD to UHD screen is mandatory
But I have to ask: are those screens actually any good? Ours have FHD panels, and I have not seen a single one with a decent screen.
There are roughly two categories: either the el-cheapo screens, with washed-out colors (6 bpp panels on a 1500 EUR laptop!), dimmer than moonlight through closed shades, but with usable viewing angles; or the "Sure View" version, with a very bright backlight, usable outside (not in direct sunlight, of course), with on-paper OK colors (specs say 100% sRGB) but laughably bad viewing angles (with Sure View off, of course) and, in practice, questionable color fidelity.
These are also fairly expensive, around 1500 EUR, and the components are of questionable quality. The SSDs in particular are dog-slow (but they're very easy to replace).
I have two 5-year-old 840 G8s (one Intel, one AMD), and they have both held up fine, but I usually don't abuse my laptops (my 2013 MBP still looks brand new aside from some scratches). However, looking around at my colleagues' laptops, they tend to fall apart, and I can count on one hand the ones still in good shape. The usual suspects seem to be the barrel power connector and the keyboard. Newer models only have USB-C AFAIK (mine have both, but came with a USB-C power adapter in the box). But they tend to look pretty bad in general, with very misaligned panels and fragile USB ports.
Lenovo T and X series are excellent and dirt cheap used. There is also System76. Or you could get a MacBook and boot Linux on that. Some older ones work well, I hear.
> Or you could get a MacBook and boot Linux on that. Some older ones work well, I hear.
Is linux support on the M1/M2 models as good as linux support on x86 laptops? My understanding was that there's still a fair bit of hardware that isn't fully supported. Like, external displays and Bluetooth.
I use an old Lenovo AIO PC to dual boot Linux Mint and Windows 10. It works well from a hardware and firmware perspective, but I've deliberately avoided Windows 11 as it is crapware.
I have done triple booting of MacOS, Linux and Windows on an old Mac Mini, and it was a nightmare to get them working, but worked well once set up.
I think well known brands and models of PCs are better for such alternative setups, rather than obscure PCs.
They don't. I don't know what they're talking about, but I've had fewer problems with linux on my framework than weird stuff on my OSX work machine. And I'm running Alpine on my framework, so if anything should be wonky it's this one.
I've used Dell Inspiron laptops in the past, never had a problem. WiFi, multimonitor output, bluetooth, etc all work out of the box with Debian or Ubuntu.
I've had very few issues with Lenovo and Toshiba. They're generally somewhat repairable. EliteBook and Z Book from HP seems fine for Linux too, but I've never had to fiddle with hardware except that I once removed a battery from an EliteBook.
It’s funny because I started with Windows 3.1 and it was actively user hostile then. From 3.1 to XP it was awful. Then it got slightly better with 7, and went downhill from there.
Realistically, switching to a major Linux distro is the most user-beneficial thing you can do, and today it is easier than ever. If my 12-year-old can figure out how to use it productively, so can anyone. Switch today and enjoy.
You just have to look at who buys Windows to understand this. It's OEMs and enterprises. Almost nobody buys an individual license. That's why they don't care. As an individual you get what your employer or hardware supplier says, like it or lump it.
Marlboro cigarettes used to be for women, including red-tipped filters to hide lipstick marks. Sales waned, so they rebranded the cigarette for men, and even succeeded in making it a definition of manliness.
Advertising stories like that make me sure M$ execs couldn't care less about damage to their image.
Linux is so much better than it used to be. You really don't need to be technical.
I have been recommending Kubuntu to Windows people. I find it's an easier bet than Linux Mint. You get the stability of Ubuntu, plus the guarantee of a Windows-like environment.
Yes, I know, Linux Mint supports Plasma, but I honestly think the "choose your desktop" part of the setup process is more confusing to a newbie than just recommending a distro with the most Windows-like UI and a straightforward installation.
Generally I recommend people use PopOS. It's well suited for laptops, as that's what System76 is focused on, and they're shipping laptops with Nvidia GPUs. I personally prefer Arch-based distros like EndeavourOS, but even with wide community support it's just more likely a noob will face an error. FWIW, I've only faced one meaningful error in the last 3 years on EndeavourOS, but I've also been daily-driving Linux for 15 years now.
I've been using PopOS for the last five years and while I generally agree… the latest release, using Cosmic by default, leaves a lot to be desired. Cosmic will eventually be good, but right now it's far from it, and I had to install GNOME as a stopgap just to have a functional desktop environment. I'll probably ditch PopOS for Arch + KDE, but I haven't had the time to do so yet for my workstation.
Truly, and to really drive it home, I’ve loved PopOS but this latest release is just too half baked. I think anyone considering it should either wait a year or use something else, and Kubuntu seems like a reasonable alternative for people coming from Windows or MacOS.
I'd give KDE a shot. It's been my preferred DE for years. But check out the wiki below and poke around for what your style is. The beauty of Linux is that it adapts to you, and switching DEs is a quick change (you do not need to change your DM to change your DE).
If you're interested in Arch, then give something like EndeavourOS a shot. Cachy is getting popular these days too, but I haven't used it; I expect it to be as easy as Endeavour or Manjaro, and those are very convenient Arch-based distros with direct Nvidia GPU support. Though if you want to learn Linux, I suggest going vanilla Arch. You'll learn a lot from the install process (it isn't uncommon to mess up; you won't brick anything, and learning about the chroot environment will help you in the future if you do mess things up).
Eh, not for laptops - I say this as someone who switched from Windows to Linux in the past year.
I have spent a decent few days getting long battery life on Linux (Fedora), with sleep/hibernate + encryption. And I still think the Linux scheduler is not using Intel's P-cores/E-cores on 13th gen correctly.
I just got a lunar lake laptop and in CachyOS you can just enable either scx_lavd or scx_bpfland from the kernel settings. I use them both: bpfland guarantees that the active application runs smoothly even if you compile code in the background, and lavd focuses on energy saving a bit more. They both understand how to use the P and E cores: especially the lavd scheduler puts the active app to a P core and all the background apps to the E cores.
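For reference, here's roughly how I switch between them outside the GUI - a sketch assuming the scx systemd service that CachyOS's kernel-settings tool drives; paths and unit names may differ on other distros:

    # tell the scx service which sched-ext scheduler to load
    echo 'SCX_SCHEDULER=scx_lavd' | sudo tee /etc/default/scx
    sudo systemctl enable --now scx.service
    # confirm which scheduler is currently attached
    cat /sys/kernel/sched_ext/root/ops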
If you have an Nvidia GPU you're generally going to need to edit the systemd services and change some kernel settings. This is a real pain point to be honest and it should be easier than it is (usually not too bad tbh)
If you want I can try to help you debug it. I don't have a fedora system but I can spin up a VM or nspawn to try to match your environment if you want
Do we have confirmation that it’s a must to upload the key if you use an MS account with Windows? Is it proven that it's not possible to configure Windows to have an MS account linked, maybe even to use OneDrive, while not uploading the BitLocker key?
Btw - my definition of “possible” would include anything possible in the UI - but if you have to edit the registry or do shenanigans in the filesystem to disable the upload from happening, I would admit that it’s basically mandatory.
I just checked on my personal desktop, which has Windows 11 installed using a local user account and is signed into my MS account for OneDrive, and my account is listed as having no recovery codes in the cloud. I don't recall editing anything in the registry to accomplish this; it was the default behavior for having a local user account. I copied my recovery codes when I built the machine and pasted them into an E2EE iPhone note, which should allow me to recover my machine if disaster strikes (also, everything is backed up to Backblaze using their client-side encryption).
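If anyone wants to double-check their own machine: the local half is one command, and the cloud half is eyeballing the recovery-key page mentioned elsewhere in the thread:

    rem elevated prompt: list every key protector on the system drive
    manage-bde -protectors -get C:
    rem then compare against https://account.microsoft.com/devices/recoverykey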
>Nowadays you just need to assume that all data on Windows computers is available to Microsoft; even if you temporarily find a way to keep your data out of their hands, an update will certainly change that.
I get why the US would not, but I really wish the rest of the world looked at this like the security and sovereignty issue that it is.
Or: Put all of Windows inside of a VM, within a host that uses disk encryption -- and let it run amok inside of its sandbox.
I did this myself for about 8 years, from 2016-2024. During that time my desktop system at home was running Linux with ZFS and libvirt, with Windows in a VM. That Windows VM was my usual day-to-day interface for the entire system. It was rocky at first, but things did get substantially better as time moved on. I'll do it again if I have a compelling reason to.
With a VM running on an encrypted file system, whatever a warrant for a bitlocker key might normally provide will be hidden behind an additional layer that Microsoft does not hold the keys to.
(Determining whether that is useful or not is an exercise for the person who believes that they have something to hide.)
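For anyone wanting to replicate it, the core of the setup is just a guest whose disk image lives on the encrypted host filesystem - a minimal libvirt sketch, where the pool paths and sizes are placeholders and --os-variant depends on your osinfo-db version:

    virt-install \
      --name win11 \
      --memory 8192 --vcpus 4 \
      --disk path=/tank/vms/win11.qcow2,size=120,format=qcow2 \
      --cdrom /tank/isos/Win11.iso \
      --os-variant win11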
It's not just Teams. You need to be constantly vigilant not to make any change that would let them link your MS account to Windows. And they make it more and more difficult not only to install but also use Windows without a Microsoft account. I think they'll also enforce it on everybody eventually.
You need to just stop using windows and that's it.
The only windows I am using is the one my company makes me use but I don't do anything personal on it. I have my personal computer next to it in my office running on linux.
Doing things like that, which are completely unrelated, should be considered data theft, and Microsoft should be punished so severely they wish they'd never had the idea to begin with.
In the startup world, BYOD is/was exceedingly common. All but two jobs of my career were happy to allow me to use my own Linux laptop and eschew whatever they were otherwise going to give me.
Obviously enterprises aren’t commonly BYOD shops, but SMBs and startups certainly can be.
… whether the people who would do such BYOD things are at all likely to be Windows users who care about this Bitlocker issue, is a different debate entirely.
I know BYOD was common (although getting a fully specced MacBook Pro was often one of the “perks”), but typically you did get (some) budget or reimbursement for using your own device. So in a sense the company was paying for your device which allows you to buy a dedicated machine.
I also notice that it helps in segmenting in the brain to use separate devices for private and business use.
I’ve been diving down the BYOD rabbit hole recently. At enterprise scale it’s not “hook in with your vpn, job done”, it’s got to be managed. Remote wipe on exit, prove the security settings, disk encryption, EDR.
What this means for the user is your personal device is rather invasively managed. If you want Linux, your distro choice may be heavily restricted. What you can do with that personal device might be restricted (all the EDR monitoring), and you’ll probably take a performance and reliability hit. Not better than just a second laptop for most people.
Teams works fine in website form for me because it IS a website (one that uses an extra ~1 GB of RAM running as a desktop app, because it's also a separate browser).
That's actually a misunderstanding that blew up into an outright lie:
The Start Menu is fully native. The "Recommended" section (and only it) is powered by a React Native backend, but the frame & controls are native XAML. (I.e. there's a JS runtime but no renderer)
All "Global Reader" accounts have "microsoft.directory/bitlockerKeys/key/read" permission.
Whether you opt in or not, if you connect your account to Microsoft, then they do have the ability to fetch the BitLocker key, if the account is not local-only. [0] Global Reader is built in to everything +365.
That's for Entra/AD, aka a workplace domain. Personal accounts are completely separate from this. (Microsoft doesn't have an AD relationship with your account; if anything, personal MS accounts reside in their own empty Entra forest.)
What do Entra role permissions have to do with Microsoft's ability to turn over data in its possession to law enforcement in response to a court order?
> Because hypotheticals that they could are not useful.
Why? They are useful to me and I appreciate the hypotheticals because it highlights the gaps between "they can access my data and I trust them to do the right thing" and "they literally can't access my data so trust doesn't matter."
Considering all the shenanigans Microsoft has been up to with windows 11 and various privacy, advertising, etc. stuff?
Hell, all the times they keep enabling one drive despite it being really clear I don’t want it, and then uploading stuff to the cloud that I don’t want?
I have zero trust for Microsoft now, and not much better for them in the past either.
This 100% happens; they've done it to at least one of my clients in pretty explicit violation of HIPAA (they are a very small health insurance broker), even though OneDrive had never been engaged with, and indeed we had previously uninstalled OneDrive entirely.
One day they came in and found an icon on their desktop labeled “Where are my files?” that explained they had all been moved into OneDrive following an update. This prompted my clients to go into full meltdown mode, as they knew exactly what this meant. We ultimately got a BAA from Microsoft just because we don't trust them not to violate federal laws again.
> MS doesn't have a magic way to reach into your laptop and pluck the keys.
Of course they do! They can just create a Windows Update that does it. They have full administrative access to every single PC running Windows in this way.
It's largely the same for all automatic updating systems that don't protect against personalized updates.
I don't know the status of the updating systems of the various distributions; if some use server-delivered scripts run as root, that's potentially a further powerful attack avenue.
But I was assuming that the update process itself is safe; the problem is that you usually don't have guarantees that the updates you get are genuine.
So if you update a component run as root, yes, the update could include malicious code that can do anything.
But even an update to a very constrained application could be very damaging: for example, if it is for an E2EE messaging application, it could modify it to send each encryption key to a law enforcement agency.
> the problem is that you usually don't have guarantees that the updates you get are genuine
A point of order: you do have that guarantee for most Linux distro packages. All 70,000 of them, in Debian's case. And all Linux distros distribute their packages anonymously, so they can never target just one individual.
That's primarily because they aren't trying to make money out of you. Making money requires a billing relationship, and tracking which of your customers own what. Off the back of that governments can demand particular users are targeted with "special" updates. Australia in particular demands commercial providers do that with its "Assistance and Access Bill (2018)" and I'm sure most governments in the OECD have equivalents.
Not really, but it's quite complex for Linux because there are so many ways one can manage the configuration of a Linux environment. For something high security, I'd recommend something like Gentoo or NixOS because they have several huge advantages:
- They make it easy to set up and maintain immutable and reproducible builds.
- You only install the software you need, and even within each software item, you only build/install the specific features you need. For example, if you are building a server that will sit in a datacentre, you don't need to build software with Bluetooth support, and by extension, you won't need to install Bluetooth utilities and libraries.
- Both have a monolithic Git repository for packages, which is advantageous because you gain the benefit of a giant distributed Merkle tree for verifying you have the same packages everyone else has. As observed with xz-utils, you want a supply chain attacker to be forced to infect as many people as possible so more people are likely to detect it.
- Sandboxing is used to minimise the lines of code during build/install which need to have any sort of privileges. Most packages are built and configured as "nobody" in an isolated sandbox, then a privileged process outside of the sandbox peeks inside to copy out whatever the package ended up installing. Obviously the outside process also performs checks such as preventing cool-new-free-game from overwriting /usr/bin/sudo.
- The time between a patch hitting an upstream repository and that patch being part of a package installed in these distributions is short. This is important at the moment because there are many efforts underway to replace and rewrite old insecure software with modern secure equivalents, so you want to be using software with a modern design, not just 5-year-old long-term-support software. E.g. glycin is a relatively new library used by GNOME applications for loading untrusted images. You don't want to be waiting 3 years for a new long-term-support release of your distribution to get this software.
No matter which distribution you use, you'll get some common benefits such as:
- Ability to deploy user applications using something like Flatpak, which runs them within a sandbox.
- Ability to deploy system services using something like systemd, which can confine them in a sandbox (see the sketch below).
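You can audit that last point yourself: systemd ships a scoring tool that reports how tightly each unit is actually sandboxed (the service name below is just an example):

    # exposure overview for every loaded unit
    systemd-analyze security
    # detailed per-directive breakdown for a single service
    systemd-analyze security sshd.service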
Microsoft have long underinvested in Windows (particularly the kernel), and have made numerous poor and failed attempts to introduce secure application packaging/sandboxing over the years. Windows is now akin to the horse and buggy when compared to the flying cars of open source Linux, iOS, Android and HarmonyOS (v5+ in particular which uses the HongMeng kernel that is even EAL6+, ASIL D and SIL 3 rated).
Furthermore it seems like it's specific to Azure AD, and I'm guessing it probably only has an effect if you enable the option to back up the keys to AD in the first place, which is not mandatory.
I'd be curious to see a conclusive piece of documentation about this, though
Regular AD also has this feature, you can store the encryption keys in the domain controller. I don't think it's turned on by default, but you can do that with a group policy update.
Note that password-based Bitlocker requires Windows Pro which is quite a bit more expensive.
> sign into your Microsoft account or link it to Windows again.
For reference, I did accidentally log into my Microsoft account once on my local account (registered in the online accounts panel). While Edge automatically enabled synchronization without any form of consent on my part, it does not look like my BitLocker recovery key is listed on https://account.microsoft.com/devices/recoverykey. But since I unlinked my account, it could be that it was removed automatically (but possibly still cached somewhere).
Not anymore: modern hardware running Windows 11 Home now also has FDE, technically running on BitLocker; it's just called "Device Encryption" and doesn't have the same options:
> For reference, I did accidentally log into my Microsoft account once on my local account (registered in the online accounts panel)
Those don't usually count as the "primary" MS account and don't convert a local account. For example, you can have multiple of those, and generally they're useful to save repeated sign-ins or to install stuff from the Microsoft Store that requires a personal account.
> Note that password-based Bitlocker requires Windows Pro which is quite a bit more expensive.
Given that:
1. Retail licenses (instead of OEM ones) can be transferred to new machines
2. Microsoft seems to be making a pattern of allowing retail and OEM licenses to upgrade to newer versions of Windows for free
A $60 difference in license cost, one-time, isn't such a big deal unless you're planning on selling your entire PC down the line and including the license with it. Hell, at this point, I haven't purchased a Windows license for my gaming PC since 2013 - I'm still using the same activation key from my retail copy of Windows 8 Pro.
This amounts to a difference of 114€ (135$ at the current exchange rate), which is significantly more. Also surprised that Windows Pro is 189% of the price of the Home edition in France but 143% in the USA.
I initially bought the Home edition but could not upgrade to Pro without buying a full license, so I had to bear the full cost of the French Pro license, which led to an upgrade cost of 259€ instead of just $60. (Basically I had to buy the Pro version to get password unlock with BitLocker, since TPM unlock was broken with dual boot and I needed to enter the recovery key after every boot into Fedora.) If it was possible to pay only the difference, they did not make it obvious.
And in general paying this much for an OS that still pushes dark pattern and ads onto me leaves quite a bad taste in my mouth; I wouldn't mind paying a subscription if I could get an OS that does what I want and gets fully out of my way. (but I guess subscription would come with mandatory online accounts which is part of the problem at hand here).
You can turn it off without resorting to a local account, although it's non-obvious.
GPEdit -> Computer Configuration → Administrative Templates → Windows Components → BitLocker Drive Encryption → Operating System Drives → “Choose how BitLocker-protected operating system drives can be recovered”
No, the actual data encryption key doesn't need to change unless you're very paranoid. The backup key and your normal key are just used to decrypt the data encryption key.
Exactly. I question why the parent says you have to re-encrypt the drive.
Microsoft has the KEK or passphrase that can be used to derive the KEK. The KEK protects the DEK which is used to encrypt the data. Rotating the KEK (or KEKs if multiple slots are used) will overwrite the encrypted DEK, rendering the old KEK useless.
Or does BitLocker work differently than typical data at rest encryption?
BitLocker recovery keys are essentially the key to an at-rest, local copy of the real key. (I.e., they need access to the encrypted drive to get the real encryption key)
When you use a recovery key at preboot, it decrypts that on-disk backup copy of the encryption key with your numerical recovery key, and uses the decrypted form as the actual disk encryption key. Thus, you can delete & regenerate a recovery key, or even create several different recovery keys.
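For anyone more at home with Linux FDE: LUKS uses the same wrapped-key design, which is why key slots can be rotated without re-encrypting anything (device path below is a placeholder):

    # add a second passphrase slot, e.g. a recovery passphrase
    cryptsetup luksAddKey /dev/sda2
    # revoke slot 1; the volume's real data key is untouched
    cryptsetup luksKillSlot /dev/sda2 1
    # inspect which key slots are populated
    cryptsetup luksDump /dev/sda2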
Only because others you communicate with may not have ADP turned on. That's a flaw with any service where you cannot control what the other end does or does not do; it's not unique to Apple/iMessage, short of using something like Signal.
Most other E2EE messaging services do not break their own E2EE by intentionally uploading messages or encryption keys to servers owned by the same company in a form that they can read. For example, Google's Messages app does not do this for E2EE conversations. This isn't something that only Signal cares about.
Does using the "manage-bde -protectors -add" command to add a device key encrypted by a local recovery key, followed by the "manage-bde -protectors -delete" command to delete the device key encrypted by the uploaded key not work?
They could have taken a more defence-in-depth approach to key storage and encrypted the cloud copy of the Bitlocker key with a random master key itself protected by a user password-derived key arrangement, with any crypto action occurring on the device to avoid knowledge of the plaintext key. That way the Bitlocker key stored in the cloud is opaque to Microsoft, and only by knowing the user's current cleartext password could they access the raw Bitlocker key.
The current approach is weak, and strikes me as a design unlikely to be taken unless all the people involved were unfamiliar with secure design (unlikely IMO), or they intentionally left the door open to this type of access.
> Except the steps to do that are: disable BitLocker, create a local user account (assuming you initially signed in with a Microsoft account, because MS now forces it on you for Home editions of Windows), delete your existing keys from OneDrive, then re-encrypt using your local account and make sure not to sign into your Microsoft account or link it to Windows again.
1. Is there any indication it forcibly uploads your recovery keys to Microsoft if you're signed into a Microsoft account? Looking at random screenshots, it looks like it presents you with an option: https://helpdeskgeek.com/wp-content/pictures/2022/12/how-to-...
2. I'm pretty sure you don't have to decrypt and re-encrypt the entire drive. The actual key used for encrypting data is never revealed, even if you print or save a recovery key. Instead, it generates "protectors": each one encrypts the actual key using a recovery key, and the encrypted version is stored on the drive. If you remove a recovery method (i.e. a protector), the associated recovery key becomes immediately useless. Therefore, if your recovery keys were backed up to Microsoft and you want to opt out, all you have to do is remove the protector.
Again, that is a lot of trust since it could trivially just… not show it. Which is already the default for most FDE systems for intermediate/system managed keys.
It could also just pretend to encrypt your drive with a null key and not do anything, either.
You need some implicit trust in a system to use it. And at worst, you can probably reverse engineer the (unencrypted) BitLocker metadata that preboot authentication reads.
You can encrypt a Bitlocker volume without syncing your keys even if you do log in with a Microsoft account, at least last time I was configuring Bitlocker.
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
Once the feature exists, it's much easier to use it by accident. A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded. And it's a silent failure: the security properties of the system have changed without any visible indication that it happened.
There are a lot of sibling comments to mine here reading this literally, but instead I would suggest the following reading: "I never selected that option!" "Huh, must have been a cosmic ray that uploaded your keys ;) Modern OS updates never obliterate user-chosen configurations."
This is correct. I also discovered, while preparing several ThinkPads for a customer based on a Windows 11 image I made, that even if you have BitLocker disabled you may also need to check that hardware disk encryption is disabled as well (it was enabled by default in my case). Although this differs from BitLocker in that the encryption key is stored in the TPM, it is something to be aware of, as it may be unexpected.
If users are so paranoid that they worry about a cosmic ray bit flipping their computer into betraying them, they're probably not using a Microsoft account at all with their Windows PC.
If your security requirements are such that you need to worry about legally-issued search warrants, you should not connect your computer to the internet. Especially if it's running Windows.
In all political environments everyone should be worried about that. The social temperature can change rapidly and you generally can't force a third party to destroy copies of your things in a reliable manner.
Right, this is just a variation on "If you have nothing to hide..."
ETA: You're not wrong; folk who have specific, legitimate opsec concerns shouldn't be using certain tools. I just initially read your post a certain way. Apologies if it feels like I put words in your mouth.
I personally saw a computer with 'system33' and 'system34' folders. Also, you would never actually know it happened because... it's not ECC. And with ECC memory we replace a RAM stick every two to three months, explicitly because the ECC error count is too high.
Rounding that to 1 error per 30 days per 256M, for 16G of RAM (64 × 256M) that would translate to 1 error roughly every half a day. I do not believe that at all, having done memory testing runs for much longer on much larger amounts of RAM. I've seen the error counters on servers with ECC RAM, which remain at 0 for many months; and when they start increasing, it's because something is failing and needs replacing. In my experience RAM failures are much rarer than HDD and SSD failures.
Given enough computers, anything will happen. Apparently enough bit flips happen in domains (or their DNS resolution) that registering domains one bit away from the most popular ones (e.g. something like gnogle.com for google.com) might be worth it for bad actors. There was a story a few years ago, but I can't find it right now; perhaps someone will link it.
A very old game speedrun -- of the era that speedruns weren't really a "thing" like they are today -- apparently greatly benefited from a hardware bit flip, and it was only recently discovered.
The Tick Tock Clock upwarp in Super Mario 64. All evidence that exists of it happening is a video recording. The most similar recording was generated by flipping a single bit in Mario's Y position, compared to other possibilities that were tested, such as warping Mario up to the closest ceiling directly above him.
I'm pretty sure that while no one knows the cause definitively, many people agreed that the far more likely explanation for the bit change was a hardware fault (memory error, bad cartridge connection or something similar) or other, more powerful sources of interference. The player that recorded the upwarp had stated that they often needed to tilt the cartridge to get the game to run, showing that the connection had already degraded. The odds of it being caused by a cosmic ray single-event upset seem to be vanishingly low, especially since similar (but not identical) errors have already been recorded on the N64.
At the time, Google was taking RAM that had failed manufacturer QA, which they had gotten for cheap, sticking it on DIMMs themselves, and trying to self-certify them.
You are right. Apologies for spreading false information(
"We provide strong evidence that memory errors are dominated by hard errors, rather than soft errors, which previous work suspects to be the dominant error mode." [0]
"Memory errors can be caused by electrical or magnetic interference (e.g. due to cosmic rays), can be due to problems with the hardware (e.g. a bit being
permanently damaged), or can be the result of corruption along the data path between the memories and the processing elements. Memory errors can be classified into soft errors, which randomly corrupt bits but do not leave physical damage; and hard errors, which corrupt bits in a repeatable manner because of a physical defect."
"Conclusion 7: Error rates are unlikely to be dominated
by soft errors.
We observe that CE [correctable errors] rates are highly correlated with system utilization, even when isolating utilization effects from the effects of temperature. In systems that do not use memory scrubbers this observation might simply reflect a higher detection rate of errors. In systems with memory scrubbers, this observation leads us to the conclusion that a significant fraction of errors is likely due to mechanisms other than soft errors, such as hard errors or errors induced on the datapath. The reason is that in systems with memory scrubbers the reported rate of soft errors should not depend on utilization levels in the system. Each soft error will eventually be detected (either when the bit is accessed by an application or by the scrubber), corrected and reported. Another observation that supports Conclusion 7 is the strong correlation between errors in the same DIMM. Events that cause soft errors, such as cosmic radiation, are expected to happen randomly over time and not in correlation.
Conclusion 7 is an interesting observation, since much previous work has assumed that soft errors are the dominating error mode in DRAM. Some earlier work estimates hard errors to be orders of magnitude less common than soft errors and to make up about 2% of all errors."
Happens all the time, in reality (even on the dark side). When the atmosphere fails to stop a particle (again, happening all the time), error correction usually handles the errant bits.
In the 2010 era of RAM density, random bit flips were really uncommon. I worked with over a thousand systems which would report ECC errors when they happen and the only memorable events at all were actual DIMM failures.
Also, around 1999-2000, Sun blamed cosmic-ray bit flips for random crashes with their UltraSPARC II CPU modules.
Yep, hardware failures, electrical glitches, EM interference... All things that actually happen to actual people every single day in truly enormous numbers.
It ain't cosmic rays, but the consequences are still flipped bits.
>A finger slip, a bug in a Windows update, or even a cosmic ray flipping the "do not upload" bit in memory, could all lead to the key being accidentally uploaded.
This is absurd, because it's basically a generic argument about any sort of feature that vaguely reduces privacy. Sorry guys, we can't have automated backups in windows (even opt in!), because if the feature exists, a random bitflip can cause everything to be uploaded to microsoft against the user's will.
>This is absurd, because it's basically a generic argument about any sort of feature that vaguely reduces privacy. Sorry guys, we can't have automated backups in windows (even opt in!), because if the feature exists, a random bitflip can cause everything to be uploaded to microsoft against the user's will.
This is a dismissal of an objection to a software system that behaves discreetly by default (no info leaves until I explicitly tell it to; this would be a nice thing, if you hadn't noticed). You repudiate the challenge on the basis of "we want to implement $system that escrows keys by default" (a bad thing, but great for the company and the host government where said thing is widely adopted).
You may not have used the exact words, but the constellation of factors is still there. We can't have nice things (machines that don't narc, that do what we tell them, etc.) because there are other forces at work in our society making these things an impossibility.
It is regrettable you do not see the pattern, but then again, that may be for the better for you. I wouldn't wish the experience of seeing things the way I do on anyone else. Definitely not a fun time. But it is certainly there.
We have mandatory identification for all kinds of things that are illegal to purchase or engage in under a certain age. Nobody wants to prosecute 12-year-old kids for lying when they clicked the "I am at least 13 years old" checkbox when registering an account. The only alternative is to do what we do with R-rated movies, alcohol, tobacco, firearms, risky physical activities (i.e. bungee jumping liability waivers), etc.: we put the onus of verifying identification on the suppliers.
I don't think that's quite right. The age-gating of the internet is part of a brand new push, it's not just patching up a hole in an existing framework. At least in my Western country, all age-verified activities were things that could've put someone in direct, obvious danger - drugs, guns, licensing for something that could be dangerous, and so on. In the past, the 'control' of things that were just information was illusory. Movie theaters have policies not to let kids see high-rated movies, but they're not strictly legally required to do so. Video game stores may be bound by agreements or policy not to sell certain games to children, but these barriers were self-imposed, not driven by law. Pornography has really been the only exception I can think of. So, demanding age verification to be able to access large swaths of the internet (in some cases including things as broad as social media, and similar) is a huge expansion on what was in the past, instead of just them closing up some loopholes.
When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.
We can't trust every private company that now has to verify age to not store that information with whatever questionable security.
If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not, then we need regulation to prevent the storage of ID information.
We should still be able to verify age while remaining pseudo-anonymous.
> If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not, then we need regulation to prevent the storage of ID information.
Querying a national registry is not good because the timing of the queries could be matched up with the timing of site logins to possibly figure out the identities of anonymous site users.
A way to address this, at the cost of requiring the user to have secure hardware such as a smart phone or a smart card or a hardware security token or similar is for your government to issue you signed identity documents that you store and that are bound cryptographically to your secure hardware.
A zero knowledge protocol can later be used between your secure hardware and the site you are trying to use that proves to the site you have ID that says you are old enough and it is bound to your hardware without revealing anything else from your ID to the site.
This is what the EU has been developing for a few years. It is currently undergoing a series of large-scale field trials, with release to the public later this year, with smartphones as the initial secure hardware. Member states will be required to support it, and any mandatory age verification laws they pass will require sites to support it (they can also support other methods).
All the specs are open and the reference implementations are also open source, so other jurisdictions could adopt this.
Google has released an open source library for a similar system. I don't know if it is compatible with the EU system or not.
I think Apple's new Digital ID feature in Wallet is also similar.
We really need to get advocacy groups that are lobbying on age verification bills to try to make it so when the bills are passed (and they will be) they at least allow sites to support some method like those described above, and ideally require sites to do so.
> If we aren't going to do a national registry that services can query to get back only a "yes or no" on whether a user is of age or not
And note that if we are, the records of the request to that database are an even bigger privacy timebomb than those of any given provider, just waiting for malicious actors with access to government records.
> When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.
Beer, sure. But if you buy certain decongestants, they do log your ID. At least that's the case in Texas.
In PA they scan your ID if you buy beer. There could be a full digital record of all my beer purchases for past 15+ years, although I'm not aware of any aggregation of this data that is happening. Not that I expect anyone doing it would talk about it.
> But if you buy certain decongestants, they do log your ID.
Yeah, but many people don't actually think War on Drugs policies are a model for civil liberties that should be extended beyond that domain (or, in many cases, even tolerated in that domain.) That policy has been effective, I guess, in promoting the sales of alternative “decongestants” (that don't actually work), though it did little to curb use and harms from the drugs it was supposed to control by attacking supply.
Depending on the gas station... I've been to at least a dozen in Texas where the clerk scanned the back of my DL for proof of age. I'm assuming that something is getting stored somewhere..
> When I go buy a beer at the gas station, all I do is show my ID to the cashier. They look at it to verify DOB and then that's it. No information is stored permanently in some database that's going to get hacked and leaked.
That's how it should be, but it's not how it is. Many places now scan your ID into their computer (the computer which, btw, tracks everything you buy). It may not go to a government database (yet) but it's most certainly being stored.
We should easily be able to, but the problem of tech illiteracy is probably our main barrier. To build such a system you’d need to issue those credentials to the end users. Those users in turn would eagerly believe conspiracy theories that the digital ID system was actually stealing their data or making it available to MORE parties instead of fewer (compared to using those ID verification services we have today).
The problem is that there is nothing done to protect privacy.
There are already plenty of entities, like banks or government sites, that not only have a reliable way of proving it's you who has access to the account, but also enough info to return a user's age without disclosing anything else. They could (or better, be forced to) provide an interface to that data.
Basically: "pick your identity provider" -> "auth on their site" -> "step showing that only age will be shared" -> response with the user's age and a unique query ID that's not related to the user's account ID.
You can always count on someone coming along and defending the multi-trillion dollar corporation that just so happens to take a screenshot of your screen every few seconds (among many, many - too many other things)
A big demographic of HN users is people who want to be the multi-trillion-dollar corporation, so it's not too surprising. In this case though I think they are right. And I'm a big-time Microsoft hater.
There is no point locking your laptop with a passphrase if that passphrase is thrown around.
Sure, maybe some thief can't get access, but they probably can if they can convince Microsoft to hand over the key.
Microsoft should not have the key; that's part of the whole point of FDE: nobody can access your drive except you.
The cost of this is that if you lose your key: you also lose the data.
We have trained users about this for a decade; there have been countless dialogs explaining this. And even if we were dumber than we are (we're not, despite what we're being told: users just have fatigue from overstimulation due to shitty UX everywhere), it's still a bad default.
Just to be clear: bitlocker is NOT encrypting with your login password! I could be a little fuzzy on the details but I believe how it works is that your TPM (Trusted Platform Module) is able to decrypt your laptop, but will only do so if there is a fully signed and trusted boot chain, so if somebody gains access to your laptop and attempts to boot into anything other than Windows, it will ask for the bitlocker key because the TPM won't play ball.
The important bit here is that ~*nobody* who is using Windows cares about encryption or even knows what it is! This is all on by default, which is a good thing, but it also means that yes, of course Microsoft has to store the keys, because otherwise a regular user will mess around with their BIOS one day and accidentally lock themselves permanently out of their computer.
If you want regular FDE without giving Microsoft the key you can go ahead and do it fairly easily! But realistically if the people in these cases were using Linux or something instead the police wouldn't have needed an encryption key because they would never have encrypted their laptop in the first place.
> nobody who is using Windows cares about encryption or even knows what it is!
Right, so the solution is to silently upload their encryption keys to Microsoft's servers without telling them? If users don't understand encryption, they certainly don't understand they've just handed their keys to a third party subject to government data requests.
> otherwise a regular user will mess around with their BIOS one day and accidentally lock themselves permanently out of their computer.
This is such transparent fear-mongering. How often does this actually happen versus how often are cloud providers breached or served with legal requests? You're solving a hypothetical edge case by creating an actual security vulnerability.
Encryption by default and cloud key escrow are separate decisions. You can have one without the other. The fact that Microsoft chose both doesn't make the second one necessary, it makes it convenient for Microsoft.
> If you want regular FDE without giving Microsoft the key you can go ahead and do it fairly easily!
Then why isn't that the default with cloud backup as opt-in? Oh right, because then Microsoft wouldn't have everyone's keys.
BitLocker encrypts data on a disk using what it calls a Full Volume Encryption Key (FVEK).[1][2] This FVEK is encrypted with a separate key which it calls a Volume Master Key (VMK), and the VMK-encrypted FVEK is stored in one to three (for redundancy) metadata blocks on the disk.[1][2] The VMK is then encrypted one or more times with keys which are derived/stored using one or more methods identified by VolumeKeyProtectorID.[2][3] These methods include what I think would now be the defaults for modern Windows installations: 3 "Numerical password" (a 128-bit recovery key formatted with checksums) and 4 "TPM And PIN". Previously, instead of 4 "TPM And PIN", most Windows installations (without TPMs forced to be used) would probably be using just 8 "Passphrase". Unless things have changed recently, in mode 4 "TPM And PIN", the TPM stores a partial key, the PIN supplied by the user is the other partial key, and both partial keys are combined to produce the key used to decrypt the VMK.
Seemingly once you've installed Windows and given Microsoft your BitLocker keys in escrow, you could then use Remove-BitLockerKeyProtector to delete the VMK which is protected with mode 3 "Numerical password" (recovery key).[4] It appears that the escrow process (possibly the same as used by BackupToAAD-BitLockerKeyProtector) might only send the numerical key, rather than the VMK itself.[5][6] I couldn't find from a quick Internet search someone who has reverse-engineered fveskybackup.dll to confirm this is the case though. If Microsoft are sending the VMK _and_ the numerical key, then they have everything needed to decrypt a disk. If Microsoft are only sending the numerical key, and all numerical-key-protected VMKs are later securely erased from the disk, the numerical key they hold in escrow wouldn't be useful later on.
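For what it's worth, the cmdlet route for that protector shuffle would be roughly the following sketch (it assumes a single numerical-password protector; whether the escrowed copy truly becomes useless is exactly the open question above):

    # find the numerical-password (recovery key) protector on C:
    $rp = (Get-BitLockerVolume -MountPoint 'C:').KeyProtector |
          Where-Object KeyProtectorType -eq 'RecoveryPassword'
    # remove it, erasing the VMK copy that the escrowed key protects
    Remove-BitLockerKeyProtector -MountPoint 'C:' -KeyProtectorId $rp.KeyProtectorId
    # then mint a replacement that never leaves the machine
    Add-BitLockerKeyProtector -MountPoint 'C:' -RecoveryPasswordProtector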
Someone did however ask the same question I first had. What if I had, for example, a billion BitLocker recovery keys I wanted to ensure were backed up for my protection, safety and peace of mind? This curious person did however already know the limit was 200 recovery keys per device, and found out re-encryption would fail if this limit had been reached, then realised Microsoft had fixed this bug by adding a mechanism to automatically delete stale recovery keys in escrow, then reverse engineered fveskybackup.dll and an undocumented Microsoft Graph API call used to delete (or "delete") escrowed BitLocker recovery keys in batches of 16.[7]
It also appears you might only be able to encrypt 10000 disks per day or change your mind on your disk's BitLocker recovery keys 10000 times per day.[8] That might sound like a lot for particularly an individual, but the API also perhaps applies a limit of 150 disks being encrypted every 15 minutes for an entire organisation/tenancy. It doesn't look like anyone has written up an investigation into the limits that might apply for personal Microsoft accounts, or if limits differ if the MS-Organization-Access certificate is presented, or what happens to a Windows installation if a limit is encountered (does it skip BitLocker and continue the installation with it disabled?).
The vast, vast majority of Windows users don't know their laptops are encrypted, don't understand encryption, and don't know what bitlocker is. If their keys weren't stored in the cloud, these users could easily lose access to their data without understanding how or why. So for these users, which again is probably >99% of all windows users, storing their keys in the cloud makes sense and is a reasonable default. Not doing it would cause far more problems than it solves.
And the passphrase they log in to Windows with is not the key; Microsoft is not storing their plaintext passphrase in the cloud, just to be clear.
The only thing I would really fault Microsoft for here is making it overly difficult to disable the cloud storage for users who do understand all the implications.
> The vast, vast majority of Windows users don't know their laptops are encrypted, don't understand encryption, and don't know what bitlocker is.
Mate, if 99% of users don't understand encryption, they also don't understand that Microsoft now has their keys. You can't simultaneously argue that users are too thick to manage keys but savvy enough to consent to uploading them.
> If their keys weren't stored in the cloud, these users could easily lose access to their data without understanding how or why.
As opposed to losing access when Microsoft gets breached, or when law enforcement requests their keys, or when Microsoft decides to lock them out? You've traded one risk for several others, except now users have zero control.
The solution to "users might lock themselves out" is better UX for local key backup, not "upload everyone's keys to our servers by default and bury the opt-out". One is a design problem, the other is a business decision masquerading as user protection.
> The only thing I would really fault Microsoft for here is making it overly difficult to disable the cloud storage for users who do understand all the implications.
That's not a bug, it's the entire point. If it were easy to disable, people who understand the implications would disable it. Can't have that, can we?
It only became off by default after those "daily rage sessions" created sufficient public pressure to turn them off.
Microsoft also happens to own LinkedIn which conveniently "forgets" all of my privacy settings every time I decide to review them (about once a year) and discover that they had been toggled back to the privacy-invasive value without my knowledge. This has happened several times over the years.
Daily rage is exactly what technology-affine people need to direct at Microslop, while helping their loved ones (and ideally businesses) transition away from vendor lock-in onto free software.
AI enshittification is irrelevant here. Why is someone pointing out that sensible secure defaults are a good thing suddenly defending the entire company?
It generally is, because in the vast majority of cases users will not keep a local copy and will lose their data.
Most (though not all) users are looking for encryption to protect their data from a thief who steals their laptop and who could extract their passwords, banking info, etc. Not from the government using a warrant in a criminal investigation.
If you're one of the subset of people worried about the government, you're generally not using default options.
For laptops, sure, but those aren't reasons for it to be the default on desktops too. Are most Windows users on laptops? I highly doubt that. So it is not a sensible default.
> It generally is, because in the vast majority of cases users will not keep a local copy and will lose their data.
What's the equivalent of thinking users are this stupid?
I seem to recall that the banks repeatedly tell me not to share my PIN number with anyone, including (and especially) bank staff.
I'm told not to share images of my house keys on the internet, let alone handing them to the government or whathaveyou.
Yet for some unknown reason everyone should send their disk encryption keys to one of the largest companies in the world (largely outside of legal jurisdiction), because they themselves can't be trusted.
Bear in mind that with a(ny) TPM chip, you don't need to remember anything.
Come off it mate. You're having a laugh aren't you?
> What's the equivalent of thinking users are this stupid?
What's the equivalent of thinking security aficionados are clueless?
Security advice is dumb and detached from life, and puts an undue burden on people that's not like anything else in life.
Sharing passwords is a feature, or rather a workaround because this industry doesn't recognize the concept of temporary delegation of authority, even though it's the basics of everyday life and work. That's what you do when you e.g. send your kid on a grocery run with your credit card.
Asking users to keep their 2FA recovery keys or disk encryption keys safe on their own - that's beyond ridiculous. Nothing else in life works that way. Not your government ID, not your bank account, not your password, not even the nuclear launch codes. Everything people are used to is fixable; there's always a recovery path for losing access to accounts or data. It may take time and might involve paying a notary or a court case, but there is always a way. But not so with encryption keys to your shitposts and vacation pictures in the cloud.
Why would you expect people to follow security advice correctly? It's detached from reality, dumb, and as Bitcoin showed, even having millions of dollars on the line doesn't make regular people capable of being responsible with encryption keys.
Your credit card analogy is doing a lot of heavy lifting here, but it's carrying the wrong cargo. Sending your kid to the shops with your card is temporary delegation, not permanent key escrow to a third party you don't control. It's the difference between lending someone your house key for the weekend and posting a copy to the council "just in case you lose yours". And you know that you've done it, you have personally weighed the risks, and if something happens with your card/key in that window, you can hold them to account. (Granted, keys can be copied.)
> Nothing else in life works that way. Not your government ID, not your bank account, not your password, not even the nuclear launch codes.
Brilliant examples of why you're wrong:
Government IDs have recovery because the government is the trusted authority that verified you exist in the first place. Microsoft didn't issue your birth certificate.
Nuclear launch codes are literally designed around not giving any single entity complete access, hence the two-person rule and multiple independent key holders. You've just argued for my position.
Banks can reset your PIN because they're heavily regulated entities with legal obligations and actual consequences for breaching trust. Microsoft's legal department is larger than most countries' regulators.
> even having millions of dollars on the line doesn't make regular people capable of being responsible with encryption keys.
Right, so the solution is clearly to hand those keys to a corporation that's subject to government data requests, has been breached multiple times, and whose interests fundamentally don't align with yours? The problem with Bitcoin isn't that keys are hard - it's that the UX is atrocious. The solution is better tooling, not surveillance capitalism with extra steps.
You're not arguing for usability. You're arguing that we should trust a massive corporation more than we trust ourselves, whilst simultaneously claiming users are too thick to keep a recovery key in a drawer. Pick a lane.
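Tangent, but the two-person rule mentioned above is easy to make concrete. Here's a toy Python sketch (purely illustrative, nothing like how launch codes actually work): split a secret into two shares so that each share alone is statistically useless and both are required to reconstruct it.

```python
# Toy illustration of the "two-person rule": split a secret into two
# shares, each individually useless, both required to reconstruct.
import os

def split(secret: bytes):
    share1 = os.urandom(len(secret))                        # pure randomness
    share2 = bytes(a ^ b for a, b in zip(secret, share1))   # secret XOR share1
    return share1, share2

def combine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

key = os.urandom(32)
s1, s2 = split(key)
assert combine(s1, s2) == key   # both holders together recover the secret
assert s1 != key and s2 != key  # either share alone is just random noise
```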
Let's be serious for a second and consider what's more useful based on the likelihood of these things actually happening.
You're saying it's likely that a laptop thief is also capable of stealing the recovery key from Microsoft's servers?
So therefore it would be better that users lost all their data if
- an update bungles the TPM trust
- their laptop dies and they extract the hard drive
- they try to install another OS alongside but fuck up the TPM trust along the way
- they have to replace a mainboard
- they want to upgrade their pc
?
I know for a fact which has happened to me more often.
You've listed five scenarios where local recovery would help and concluded that cloud escrow is therefore necessary. The thing is every single one of those scenarios is solved by a local backup of your recovery key, not by uploading it to Microsoft's servers.
The question isn't "cloud escrow vs nothing". It's "cloud escrow vs local backup". One protects you from hardware failure. The other protects you from hardware failure whilst also making you vulnerable to data breaches, government requests, and corporate policy changes you have zero control over.
You've solved a technical problem by creating a political one. Great.
> Sending your kid to the shops with your card is temporary delegation, not permanent key escrow to a third party you don't control. It's the difference between lending someone your house key for the weekend and posting a copy to the council "just in case you lose yours".
Okay, then take sharing your PINs with your spouse. Or for that matter, account passwords or phone unlock patterns. It's a perfectly normal thing that many people (including myself) do, because it enables ad-hoc delegation. "Honey, can you copy those photos to my laptop and send them to godparents?", asks my wife as she hands me her phone and runs to help our daughter with something - implicitly trusting me with access to her phone, thumbdrive, Windows account, e-mail account, and WhatsApp/Messenger accounts.
These kinds of ad-hoc requests happen for us regularly, in both directions, without our giving it much of a thought[0]. It's common between couples, variants of that are also common within family (e.g. grandparents delegating most of computer stuff to their adult kids on an ad-hoc basis), and variants of that also happen regularly in workplaces[1], despite the whole corporate and legal bureaucracy trying its best to prevent it[2].
> Government IDs have recovery because the government is the trusted authority that verified you exist in the first place. Microsoft didn't issue your birth certificate.
But Microsoft issued your copy of Windows and BitLocker and is the one responsible for your data getting encrypted. It's obvious for people to seek recourse with them. This is how it works in every industry other than tech, which is why I'm a supporter of governments actually regulating in requirements for tech companies to offer proper customer support, and stopping the "screw up managing 2FA recovery keys, lose your account forever" bullshit.
> Banks can reset your PIN because they're heavily regulated entities with legal obligations and actual consequences for breaching trust.
As it should be. As it works everywhere, except tech, and especially except in the minds of security aficionados.
> Nuclear launch codes are literally designed around not giving any single entity complete access, hence the two-person rule and multiple independent key holders.
Point being, if enough right people want the nukes to be launched, the nukes will be launched. This is about the highest degree of responsibility on the planet, and relevant systems do not have the property of "lose the encryption key we told you 5 years ago to write down, and it's mathematically proven that no one can ever access the system anymore". It would be stupid to demand that.
That's the difference between the infosec industry and real life: in real life, there is always a way to recover. Infosec is trying to normalize data and access being fundamentally unrecoverable after even the slightest fuckup, which is a degree of risk individuals and society have not internalized yet, and are not equipped to handle.
> Right, so the solution is clearly to hand those keys to a corporation that's subject to government data requests, has been breached multiple times, and whose interests fundamentally don't align with yours?
Yes. For normal people, Microsoft is not a threat actor here. Nor is the government. Microsoft is offering a feature that keeps your data safe from thieves and stalkers (and arguably even organized crime), but that doesn't require you to suddenly treat your laptop with more care than you treat your government ID. They can do this, because for users of this feature, Microsoft is a trusted party.
Ultimately, that's what security aficionados and cryptocurrency people don't get: the world runs on trust. Trust is a feature.
--
[0] - Though less and less of that because everyone and their dog now wants to require 2FA for everything. Instead of getting the hint that passwords are not meant to identify a specific individual, they're doubling down and tying every other operation to a mobile phone, so delegating desktop operations often requires handing over your phone as well, defeating the whole point. This is precisely what I mean by the industry not recognizing or supporting the concept of delegation of authority.
[1] - The infamous practice of writing passwords on post-it notes isn't just because of onerous password requirements, it's also a way to facilitate temporary delegation of authority. "Can you do X for me? Password is on a post-it in the top drawer."
[2] - GDPR or not, I still heard from doctors I know personally that sharing passwords to access patient data is common, and so is bringing some of it back home on a thumb drive, to do some work after hours. On the one hand, this creates some privacy risks for patients (and legal risk for hospitals) - but on the other hand, these doctors don't do it because they hate GDPR or their patients. They do it because it's the only way they can actually do their jobs effectively. If rules were actually enforced to prevent it, people would die. This is what I mean when I say that security advice is often dumb and out of touch with reality, and ignored for very good reasons.
Your entire argument rests on conflating "trust" with "blind dependency on a third party subject to legal compulsion".
> Okay, then take sharing your PINs with your spouse.
Sharing with your spouse is consensual, temporary, and revocable. You know you've done it, you trust that specific person, and you can change it later. Uploading your keys to Microsoft is none of these things.
> But Microsoft issued your copy of Windows and Bitlocker and is the one responsible for your data getting encrypted.
Microsoft sold you software. They didn't verify your identity, they're not a regulated financial institution, and they have no duty of care beyond their terms of service. The fact that they encrypted your drive doesn't make them a trustworthy custodian of the keys any more than your locksmith is entitled to copies of your house keys.
> For normal people, Microsoft is not a threat actor here. Nor is the government.
"Normal people" includes journalists, lawyers, activists, abuse survivors, and anyone else Microsoft might be legally compelled to surveil. Your threat model is "thieves and stalkers". Mine includes the state. Both are valid, but only one of us is forcing our model on everyone by default.
> the world runs on trust. Trust is a feature.
Trust in the wrong entity is a vulnerability. You're arguing we should trust a corporation with a legal department larger than most countries' regulators, one that's repeatedly been breached and is subject to government data requests in every jurisdiction it operates.
Your doctors-breaking-GDPR example is particularly telling: you've observed that bad UX causes people to route around security, and concluded that security is the problem rather than the UX. The solution to "delegation is hard" isn't "give up and trust corporations". It's "build better delegation mechanisms". One is an engineering problem. The other is surrender dressed as pragmatism.
So what happens if your motherboard gets fried and you don’t have backups of your recovery key or your data? TPMs do fail on occasion. A bank PIN you can call and reset, they can already verify your identity through other means.
> So what happens if your motherboard gets fried and you don't have backups of your recovery key or your data?
If you don't have backups of your data, you've already lost regardless of where your recovery key lives. That's not an encryption problem, that's a "you didn't do backups" problem, which, I'll agree is a common issue. I wonder if the largest software company on the planet (with an operating system in practically every home) can help with making that better. Seems like Apple can, weird.
> TPMs do fail on occasion.
So do Microsoft's servers. Except Microsoft's servers are a target worth attacking, whereas your TPM isn't. When was the last time you heard about a targeted nation-state attack on someone's motherboard TPM versus a data breach at a cloud provider?
> A bank PIN you can call and reset, they can already verify your identity through other means.
Banks can do that because they're regulated financial institutions with actual legal obligations and consequences for getting it wrong. They also verified your identity when you opened the account, using government ID and proof of address.
Microsoft is not your bank, not your government, and has no such obligations. When they hand your keys to law enforcement, which they're legally compelled to do, you don't get a phone call asking if that's alright.
The solution to TPM failure is a local backup of your recovery key, stored securely. Not uploading it to someone else's computer and hoping for the best.
> I wonder if the largest software company on the planet (with an operating system in practically every home) can help with making that better. Seems like Apple can, weird.
If you're talking about Time Machine, Windows has had built-in backup options since NT.
There are a lot of people here criticising MSFT for implementing a perfectly reasonable encryption scheme.
This isn’t some secret backdoor, but a huge security improvement for end-users. This mechanism is what allows FDE to be on by default, just like (unencrypted) iCloud backups do for Apple users.
Calling bs on people trying to paint this as something it’s not is not “whiteknighting”.
Yes, because object level facts matter, and it's intellectually dishonest to ignore the facts and go straight into analyzing which side is the most righteous, like:
>Microsoft is an evil corporation, so we must take all bad stories about them at face value. You're not some corpo bootlicker, now, are you? Now, in unrelated news, I heard Pfizer, another evil corporation with a dodgy history[1] is insisting their vaccines are safe...
Microsoft doesn't take the screenshot; their operating system does if Recall is enabled, and although the screenshots themselves are stored in an insecure format and location, Microsoft doesn't get them by default.
> If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
Yes. The thing is: Microsoft made the design decision to copy the keys to the cloud, in plaintext. And they made this decision with the full knowledge that the cops could ask for the data.
You can encrypt secrets end-to-end - just look at how password managers work - and it means the cops can only subpoena the useless ciphertext. But Microsoft decided not to do that.
I dread to think how their passkeys implementation works.
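For the curious, here's a minimal sketch of the password-manager-style approach described above, in Python with the `cryptography` package. Everything here (names, parameters) is illustrative, not Microsoft's actual escrow design; the point is just that the wrapping key is derived from a secret only the user knows, so the server only ever stores ciphertext.

```python
# Hypothetical sketch of client-side ("end-to-end") escrow of a
# recovery key, in the style of a password manager. Illustrative only.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def wrap_recovery_key(recovery_key: bytes, user_passphrase: str) -> dict:
    """Encrypt the recovery key under a key derived from a secret only
    the user knows. Only the resulting blob ever leaves the machine."""
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    wrapping_key = kdf.derive(user_passphrase.encode())
    nonce = os.urandom(12)
    ciphertext = AESGCM(wrapping_key).encrypt(nonce, recovery_key, None)
    # The server stores only salt + nonce + ciphertext; without the
    # passphrase, a subpoena yields useless ciphertext.
    return {"salt": salt, "nonce": nonce, "ciphertext": ciphertext}
```

The trade-off is the one debated all over this thread: forget the passphrase and the escrowed blob is unrecoverable, which is exactly the failure mode Microsoft's default is built to avoid.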
> Yes. The thing is: Microsoft made the design decision to copy the keys to the cloud, in plaintext. And they made this decision with the full knowledge that the cops could ask for the data.
Apple does this too. So does Google. This is nothing new.
It's a commonly used feature by the average user who loses their password or their last device.
During set up, they even explicitly inform the user that their bitlocker keys are being backed up to the cloud. And, you can still choose to use bitlocker without key escrow.
If the user's macOS FileVault disk encryption key is "stored in iCloud", it resides in the user's iCloud Keychain, which is end-to-end encrypted. This creates a situation similar to the iPhone, where Apple does not have the ability to access the user's data and therefore cannot comply with a warrant for access (which really annoys organizations like the FBI and Interpol).
Power users should stop bothering with Windows nonsense and install Linux instead so that they can actually have control over their system.
It's 2026. The abuses of corporations are well documented. Anyone who still chooses Windows of their own volition is quite literally asking for it and they deserve everything that happens to them.
You only have to run through a modern Windows installer to understand how screwed you are if you install it. Last time I did this for a disposable Windows VM (a couple of years ago) I remember having to click through a whole bunch of prompts asking about all the different types of data Microsoft wanted my computer to send them. Often the available answers weren't "yes" or "no" but more like "share all data" vs "share just some data". After that I recall being forced to sign up for an Outlook account just to create a local login unless I unplugged my network cable during the install. I've heard they have closed that loophole in recent installers.
I'd already long since migrated away from Windows but if I'd been harbouring any lingering doubts, that was enough to remove them.
I’ll bite. What Linux distro currently has the nicest desktop experience? I work on a MacBook but my desktop is a windows PC that I use for gaming and personal projects. I hear Proton has made the former pretty good now, and the latter is mostly in WSL for me anyway. Maybe a good time to try.
What do you suggest? I’ll try it in a VM or live usb.
Bazzite. It's KDE, it's easy, it's immutable so you can update and it's unlikely to break shit. It comes with Steam already. Keyboard shortcuts very similar to Windows. Dolphin (File Explorer equivalent) responds as quickly as one would expect File Explorer to respond if it were developed by sane people. You also get an Android-style permission system with Flatseal, so you can disable permissions for various applications.
One warning: keep in mind that if your desktop PC motherboard has a MediaTek wifi+bluetooth chip, that chip will probably not work on any version of Linux (AFAIK). I don't use wifi on my desktop but I do use bluetooth game controllers. You can replace the chip (which is what I did, with https://www.amazon.com/dp/B08MJLPZPL), get a bluetooth dongle (my friend recommends https://www.amazon.com/Bluetooth-Wireless-External-Receiver-...), or get a PCIe one.
There are so many distros that it really depends on your use-case and it's hard to make a generic suggestion. Ubuntu is a common recommendation for first timers, mainly because as the most popular distro you'll easily be able to Google when you need help with something, and it also uses the most popular package format (.deb). There's also Linux Mint which is basically Ubuntu but with some of the latter's more questionable choices removed (e.g. snaps) and minus the big corp owner. By using one of these you'll also be learning skills relevant to Debian (which Ubuntu is derived from) which is a solid choice for servers.
Regardless of which distro you choose, your "desktop experience" will be mostly based on what desktop environment you pick, and you are free to switch between them regardless of distro. Ubuntu for example provides various installers that come with different DEs installed by default (they call them "flavours": https://ubuntu.com/desktop/flavors), but you can also just switch them after installation. I say "mostly" because some distros will also customise the DE a bit, so you might find some differences.
"Nicest desktop experience" is also too generic to really give a proper suggestion. There are DEs which aim to be modern and slick (e.g. GNOME, KDE Plasma, Cinnamon), lightweight (LXQt), or somewhere in between (Xfce). For power users there's a multitude of tiling window managers (where you control windows with a keyboard). Popular choices there are i3/sway or, lately, Niri. All of these are just examples, there are plenty more DEs / WMs to pick from.
Overall my suggestion would be to start with something straightforward (Mint would probably be my first choice here), try all the most popular DEs and pick the one you like, then eventually (months or years later) switch to a more advanced distro once you know more what your goals are and how you want to use the system. For example I'm in the middle of migrating to NixOS because I want a fully declarative system which gives the freedom to experiment without breaking your system because you can switch between different temporary environments or just rollback to previous generations. But I definitely wouldn't have been ready for that at the outset as it's way more complex than a more traditional distro.
Something with KDE. Never used KDE extensively because I hate non-tiling WMs, but something like Kubuntu would give you a more Windows-esque experience by default.
I don't use KDE either, but it does seem to be the most Windows adjacent choice. Unless you like very old versions of Windows in which case you may prefer XFCE like me (Xubuntu or the xfce variant of Linux mint).
I heard Kubuntu is not a great distro for KDE, but I can't comment on that personally.
That's literally like asking "What car has the best driving experience?". There is no one answer.
If you want something that "just works," Linux Mint[1] is a great starting point. That gets you into Linux without any headache. Then, later when bored, you can branch out into the thousands[2] of Linux distributions that fill every possible niche
I would never recommend anything from the Debian family for consumer use. It's literally outdated Linux, marketed as "stable".
Fedora is significantly better.
I wouldn't confuse popularity for good. Ubuntu gave away free CDs in the 2000s and are living off old marketing.
The Debian family is so bad. You will constantly be in the terminal just trying to get stuff to work. Stick to a well-maintained, up-to-date, consumer distro: Fedora.
If you want maximum convenience and as many things to "just work" as possible out of the box, go for good old plain Ubuntu.
If you care a little more about your privacy and are willing to sacrifice some convenience, go for Fedora. It's community-run and fairly robust. You may have issues with media codecs, Nvidia drivers and a few other wrinkles though. The "workstation" flavor is the most mature, but you may want to give the KDE version a try.
If you want an adventure, try everything else people are recommending here :)
Not sure it's good as a starter distro, but other than that I agree. I was put off NixOS for a long time despite loving the principles behind it. Then a few weeks ago I had ChatGPT give me a short course on it, including flakes and the basics of the Nix language. I completed that in a few hours and achieved more than I ever had reading the Nix docs and blogs etc. Now I'm able to use an LLM to help me write flakes while also understanding what it is doing (I'm not a fan of blindly using AI generated code).
That's what I'm getting at - the nixos learning curve is flattened out completely with LLMs to the point that I do recommend it as a starter distro for anyone technically competent (as it's still crucial to actually read and understand what the LLM produces)
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
The real issue is that you can't be sure that the keys aren't uploaded even if you opt out.
At this point, the only thing that can restore trust in Microsoft is open sourcing Windows.
Last time I needed to install Windows 11, avoiding making a Microsoft account required (1) opening a command line to run `oobe\bypassnro`, and (2) skipping past the wifi config screen. While these are quick steps, neither of those is at all "easy", since they require a user to first know that it is an option in the first place.
And newer builds of Windows 11 are removing these methods, to force use of a Microsoft account. [0]
> it was really easy to set up without a Microsoft account.
By "really easy" do you mean you had a checkbox? Or "really easy" in that there's a secret sequence of key presses at one point during setup? Or was it the domain join method?
Googling around, I'm not sure any of the methods could be described as "really easy" since it takes a lot of knowledge to do it.
I recently had to install Windows for the first time in ages because reasons, and it really wasn’t very hard. The setup really just presents two options at a time: the cloudy option, and the other option. If in doubt, the flashy one is the cloudy one. I kept selecting the non cloudy option and got to the desktop without signing up for anything. Sure it took more clicking than last time I went through this, but really wasn’t nearly as bad as people say and didn’t take any windows know-how or googling. Might be very different between editions and regions though…
Edit: ofc we all agree local accounts needs to be a supported option, but perhaps we should be more careful about yelling from the rooftops that it’s practically impossible. I’ve been told for years now that it’s really hard or impossible, and it really was not that hard (yet…)
The same way you know that your browser session secrets, bank account information, crypto private keys, and other sensitive information is never uploaded. That is to say, you don't, really - you have to partially trust Microsoft and partially rely on folks that do black-box testing, network analysis, decompilation, and other investigative techniques on closed-source software.
I'm not sure how to do this on Windows, but to disable FileVault cloud key backup on Mac, go to `Settings > Users & Groups > click on the (i) tooltip next to your account` and uncheck "Allow user to reset password using Apple Account".
This is a part of Settings that you will never see at a passing glance, so it's easy to forget that you may have it on.
I'd also like to gently push back against the cynicism expressed about having a feature like this. There are more people who benefit from a feature like this than not. They're more likely thinking "I forgot my password and I want to get the pictures of my family back" than fully internalizing the principles and practices of self custody - one of which is that if you lose your keys, you lose everything.
I’m not sure if you misunderstand how macOS accounts work or how FileVault works.
There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account. Either of these types of accounts may be associated with an iCloud account. macOS doesn’t work like Windows where your Microsoft account is your login credential for the local machine.
FileVault key escrow is something you can enable when enabling FileVault, usually during initial machine setup. You must be logged into iCloud (which happens in a previous step of the Setup Assistant) and have iCloud Keychain enabled. The key that wraps the FileVault volume encryption key will be stored in your iCloud Keychain, which is end-to-end encrypted with a key that Apple does not have access to.
If you are locked out of your FileVault-encrypted laptop (e.g. your local user account has been deleted or its password has been changed, and therefore you cannot provide the key to decrypt the volume encryption key), you can instead provide your iCloud credentials, which will use the wrapping key stored in escrow to decrypt the volume encryption key. This will get you access to the drive so you can copy data off or restore your local account credentials.
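To make the wrapping-key arrangement concrete, here's a conceptual sketch using RFC 3394 AES key wrap from Python's `cryptography` package. This is not Apple's actual implementation; it just shows how a small wrapping key can sit in escrow (in Apple's case, inside the end-to-end encrypted iCloud Keychain) and later recover the volume key, without the volume key itself ever being stored in the escrow.

```python
# Conceptual sketch of a two-level FileVault-style key design
# (illustrative only, not Apple's actual implementation).
import os
from cryptography.hazmat.primitives.keywrap import aes_key_wrap, aes_key_unwrap

volume_key = os.urandom(32)    # encrypts the disk; never leaves the machine
wrapping_key = os.urandom(32)  # KEK; the piece that would sit in escrow

# Stored on the disk itself: the volume key, wrapped under the KEK.
wrapped = aes_key_wrap(wrapping_key, volume_key)

# Recovery path: whoever can produce the escrowed KEK (e.g. via iCloud
# credentials unlocking the end-to-end encrypted Keychain) can unwrap
# the volume key and decrypt the drive.
assert aes_key_unwrap(wrapping_key, wrapped) == volume_key
```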
> There are two ways to log into macOS: a local user account or an LDAP (e.g. OpenDirectory, Active Directory) account.
And just in case it wasn't clear enough, I'd add: a local user account is standard. The only way you'd end up with an LDAP account is if you're in an organization that deliberately set your computer up for networked login; it's not a typical configuration, nor is it a component used by iCloud.
As someone who has benefited from this once, I have to say: good.
In my humble opinion, the current state is better than no encryption at all. For example: laptop theft, scavengers trying to find pictures, etc. And if you think you are a target of either Microsoft or law enforcement, manage your keys yourself or go straight to Linux.
MacOS has this feature as well. It used to be called "Allow my iCloud account to unlock my disk," but it keeps getting renamed and moved around in new MacOS versions. I think it's now tied together with remote password resets into one option called "allow user to reset password using Apple Account."
To be fair, that makes it even more ominous with Apple. At least Microsoft explicitly informs you during setup and isn't trying to hide it behind some vague language about "resetting password".
Exactly. And any halfway decent corporate IT setup would be managing the keys themselves as well (although I would imagine many third party tools could also be compelled to do this with a proper warrant)
BitLocker on by default (even if Microsoft does have the keys and complies with warrants) is still a hell of a lot better than the old default of no encryption. At least some rando can't steal your laptop, pop out the HDD, and take whatever data they want.
The issue is about getting locked out of your own data, which can easily happen in a number of cases.
And you don't necessarily need to actually have your account banned.
Let's say you signed up for a Microsoft account when setting up a new PC (well, because you have to). You don't use that account anywhere else, and you forgot the password, even though you can log in via PIN or something else. Now you install Linux or just boot into a different system once. When you need to boot into Windows again, good luck.
And that's just one of the cases.
A real disaster happened to someone, although on a different platform, and the context is a bit different: https://hey.paris/posts/appleid/
Users absolutely 100% will lose their password and recovery key and not understand that even if the bytes are on a desk physically next to you, they are gone. Gone baby gone.
In university, I helped a friend set up encryption on a drive w/ his work after a pen drive with work on it was stolen. He insisted he would not lose the password. We went through the discussion of "this is real encryption. If you lose the password, you may as well have wiped the files. It is not in any way recoverable. I need you to understand this."
Some people will hurt themselves if given dangerous tools, but if you take all the dangerous items out of the tool shop, there won't be any tools left.
Microsoft seems to feel constant pressure to dumb Windows down, but if you look at the reasons people state when switching to Linux, control is a frequent theme. People want the dangerous power tools.
Tool manufacturers include all kinds of annoying safety devices to attempt to prevent injury, or at least to give them some cover in a lawsuit.
Table saw blade guards and riving knives are an ironic example here: I've yet to hear a story of a woodworker who lost a finger on a table saw who couldn't have avoided that injury by keeping one of those safety devices on the saw. Everyone thinks the annoyance isn't worth it, since they are an "expert", yet it happens frequently.
Right, but none of those safety devices invalidate the underlying purpose of the tools. Disk encryption is used, for many people, for privacy. Uploading the keys to Microsoft defeats a lot of that.
If you bought a table saw and the "safety device" is that it won't run, I would imagine you'd be pissed too.
Okay, so then the default for 95% of users is no encryption at all and police (or the far more likely thief, roommate, etc) don't even have to bother with a warrant to get all your data.
Because now all the people at the computer recycle shop can't access all your old files including your family photos and saved passwords. They'd be missing out on all that fun.
At Microsoft-scale, data requests from law enforcement are an inevitability. Designing a system such that their requests are answerable is a choice. Signal's cloud backup system is an example of a different choice being made.
In court? Not really. These warrants are on solid ground from a legal standpoint. To the point that fighting them could be a sanction-able kind of grandstanding.
The "Microsoft gave" framing is the exact right wording!, because Microsoft should never have had these keys in the first place. This is a compromise on security that sidesteps back doors on the low level and essentially transforms all Windows installations into Clipper-chip products.
To be fair, if they didn't have BitLocker enabled at all, the FBI would have just scanned the hard-drive as-is. The only usefulness of BitLocker is if a stranger steals your laptop, assuming Microsoft doesn't hand out the keys to just anybody, your files should be safe, in theory.
> Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
I’m not sure how you’re criticizing the “gave” framing when you’re describing and stating Microsoft literally giving the keys to the FBI.
Because "gave" implies a favor or a one sided exchange. It implies that Microsoft is just giving away keys for no reason!
Better, and more accurate wording, would be that "Microsoft surrendered keys" or "Microsoft ceded keys". Or "Microsoft legally compelled to give the keys". If Microsoft did so without a warrant, then "gave" would be more tonally accurate.
In addition, none of this is new. They've been turning over keys when legally compelled to, for many years.
The fact that none of this is new undermines your point. Microsoft knew that law enforcement would ask for keys, based on their prior experience and the sack of meat sitting between their ears.
They, knowing that, chose to design a system that trivially allows this. That is a choice. In that sense, they did give up the keys. They certainly did not have to design it that way, nor was it done in ignorance.
In fairness, the link is specifically for "Advanced Data Protection for iCloud". This has nothing to do with local whole-disk encryption like FileVault or BitLocker.
In Apple's case, even when the user enables iCloud FileVault key backup, that key is still end-to-end encrypted and Apple cannot access it. As a matter of fact, while Apple regularly receives legal warrants for access, they are ineffective because Apple has no way to fulfill that request/requirement.
Microsoft has chosen to store the BitLocker key backups in a manner that maintains their (Microsoft's) access. But this is a choice Microsoft has made; it's not an intrinsic requirement of a key escrow system. And in the end, it enables law enforcement to compel them to turn over these keys when a judge issues a warrant.
> Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
Often it is the case that companies hand over private data to law enforcement just by being asked for it nicely, no warrant needed.
While it is true that NSLs or other coercion tactics will force them to give out the keys, it is also true that this is only possible because Microsoft implemented a fatally flawed system where they have access to the keys.
Any system where a third party has access to cleartext or the keys to decrypt to cleartext is completely broken and must not be used.
Microsoft did give them. Just because they have a warrant doesn't mean keys should be handed over in any usable form. As indicated in the Forbes [0] article - both Meta and Apple have the exact same convenience in place (cloud backup) with none of the direct risk.
So, yes. That is how it works: 1) Microsoft forces users to online accounts 2) BitLocker keys are stored in an insecure manner allowing any US agency to ask for them. I intentionally say "ask for them" because the US government is a joke with respect to respecting its own citizens' privacy [1] at this point.
This type of apologetic half-truth on behalf of a multi-billion dollar corporation is getting old fast.
There is no other way for this to work that won't result in an absolutely massive number of people permanently losing data they had no idea was encrypted. Well, there is one: leave BitLocker disabled by default and the drive unencrypted. Now the police don't even have to ask!
With this scheme the drive is recoverable by the user and unreadable to everyone except you, Microsoft, and the police. Surely that's a massive improvement over sitting in plaintext readable by the world. The people who are prepared to do proper key management will know how to do it themselves.
Apple does the same thing with FileVault when you set up with your iCloud account where, again, previously your disk was just left unencrypted.
Security is not a switch you can turn on and forget about. Plus the police have extraordinary real world powers to compel you to disclose the necessary information anyways. Unless you're holding state secrets, which, c'mon, you're almost certainly going to give in and cooperate at some point. It wouldn't make for a great Hollywood movie but it would accurately reflect day to day reality.
> unreadable to everyone except you, Microsoft, and the police.
That's two too many. It should either be unreadable to everyone but me or readable by anyone with physical access. Does it not occur to people that you can still rely on physical security even in computing?
> Apple does the same thing
The two corporate computing giants do the same thing? I am not surprised but I also don't see it as a worthwhile data point.
"Apple does the same thing with FileVault when you set up with your iCloud account where, again, previously your disk was just left unencrypted"
Nah, the FileVault key is stored in your iCloud Keychain when you choose to backup the key to iCloud. And the keychain is end-to-end encrypted. Only the user has access.
All that is true, and the question I focus on is whether Microsoft could have implemented it such that they have zero(-ish) knowledge by default.
We know iCloud has configurations under which data can't be disclosed, and I wonder if there is a middle ground between "if you lose the recovery key you are stuffed" and a recovery key unlocked by a password, similar to SSH keys.
>Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
I have W11 with a local account and no BitLocker on my desktop computer, but the sheer amount of nonsense MS has been doing these days has really made me question whether "easy modding*" is enough of a benefit for me not to just nuke it and install Linux yet again.
* You can get the MO2 mod manager running under linux, but it's a pain, much like you can also supposedly run executable mods (downgraders, engine patches, etc) in the game's context, but again, pain
Correct me if I'm wrong, but isn't forcing you to divulge your encryption password compelled speech? So the police can crack my phone but they can't force me to tell them my PIN.
Yes, you cannot be compelled to testify against yourself, but Microsoft is under no such obligation when served a warrant, because of the third-party doctrine. Microsoft holding BitLocker recovery keys is considered you voluntarily giving the information to a third party, so the warrant isn't compelling you to do anything, and thus there is no rights violation.
But the 5th Amendment is also why it's important not to rely on biometrics. Generally (there are some gray areas) in the US you cannot be compelled to give up your password, but biometrics are viewed as physical evidence and not protected by the 5th.
Warrants are a mechanism by which speech is legally compelled.
The 5th Amendment gives you the right to refuse speech that might implicate you in a crime. It doesn’t protect Microsoft from being compelled to provide information that may implicate one of its customers in a crime.
Indeed. Third-party doctrine has undermined 4th/5th Amendment protections via the harebrained power grab that was "if you share info with a third party as part of the only way of doing business, you waive 4th Amendment protections". Ironically, Boomers basically kneecapped constitutional protections for the very data most critically in need of protection in a networked world.
The only fix is apparently waiting until there's enough support to cram through an Amendment or set a precedent that fixes it.
Well, SCOTUS has hemmed and hawed over several cases about whether to extend the 4th Amendment to third-party data in some scenarios. IIRC there is an online email case working its way up through the 9th Circuit right now?
One of the reasons given for (usually) now requiring a warrant to open a phone they grab from you is the amount of third-party data you can access through it, although IIRC they framed it as a regular 4th Amendment issue by saying that if you had a security camera inside your house, the police would be bypassing the warrant requirement by seeing directly into your abode.
They can't force you to tell them your PIN in some countries, but they can try all PINs, and they can search your desk drawer to find the post-it where you wrote your PIN.
Good PINs are ones you're not allowed to brute force. You can easily configure an iPhone to wipe itself after too many wrong guesses. There's a single checkbox labeled "Erase Data", saying "Erase all data on this iPhone after 10 failed passcode attempts."
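The arithmetic behind that checkbox is simple enough to sanity-check (illustrative numbers, assuming a uniformly random PIN):

```python
# Back-of-the-envelope check of why the wipe-after-10 limit matters.
attempts_allowed = 10

for digits in (4, 6):
    combinations = 10 ** digits
    p_success = attempts_allowed / combinations
    print(f"{digits}-digit PIN: {combinations:,} combinations, "
          f"{p_success:.4%} chance of a lucky guess before the wipe")

# 4-digit PIN: 10,000 combinations, 0.1000% chance of a lucky guess before the wipe
# 6-digit PIN: 1,000,000 combinations, 0.0010% chance of a lucky guess before the wipe
```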
> Any power users who prefer their own key management should follow the steps to enable Bitlocker without uploading keys to a connected Microsoft account.
You mean "Install Linux",because that's easier than dealing with the steps required to do that on Windows
The reasonable default is transparency about it and 2FA for recovery scenarios. MS does not have to have the keys in the clear, as it is reasonable for any secrets you store.
The same is true for Apple laptops! Take a look in your Passwords app and you will see it automatically saves and syncs your laptop decryption key into the cloud.
So all the state needs to get into your laptop is to get access from Apple to your iCloud account.
"For additional privacy and security, 15 data categories — including Health and passwords in iCloud Keychain — are end-to-end encrypted. Apple doesn't have the encryption keys for these categories, and we can't help you recover this data if you lose access to your account. The table below includes a list of data categories that are always protected by end-to-end encryption."
The FileVault keys are stored in the iCloud Keychain and Apple does not have access to them, full stop :-)
You are conflating iCloud Keychain with the rest of the iCloud data. iCloud keychain is always end-to-end encrypted. Apple cannot decrypt it even if they receive a subpoena. The other iCloud data like your photos are not end-to-end encrypted by default unless you turn on Advanced Data Protection (ADP).
In the news article you shared above, it's very likely this person did not have ADP turned on. So everything in their iCloud that is not E2EE by default could be decrypted by Apple.
The apple support link above has a table showing what apple has access to depending on if the user has Advanced Data Protection on or not.
The link you posted shows that the FBI got access to iCloud and found screenshots saved there -- not the device; if the guy had had ADP on, all the FBI would get is mail, contacts, and calendar data saved to iCloud, as Apple wouldn't have the key for the rest of it.
There needs to be more awareness of setting up W11 install ISOs, which can be modified to disable BitLocker by default and to disable the online account requirement.
I recently needed to make a bootable key and found that Rufus out of the box allows you to modify the installer. Game changer.
This. Real "power users" (as opposed to people who aren't completely computer-illiterate) use the likes of Arch Linux and Gentoo and self-host whatever "cloud" services they need, they aren't running Windows and paying for Copilot 365 subscriptions.
"enemy of the state" depends a lot on the current state of the state.
Eg in England you're already an enemy of the state when you protest against Israel's actions in Gaza. In America if you don't like civilians being executed by ICE.
This is really a bad time to throw "enemy of the state" around as if this only applies to the worst people.
Current developments are the ideal time to show that these powers can be abused.
Very much hyperbolic about the UK. You’re fine protesting against Israel, but Palestine Action is a proscribed group (not that I agree with that!) and that will land you in trouble.
No you aren't, so why are you lying? You can protest all you want; the only time people got in trouble was because of the Nazi flags the protestors were using and extreme Islamists trying to recruit terrorists.
> Are we calling everyone who wants some control over their computers enemies of the state?
As of today at 00:00 UTC, no.
But there's an increasingly possible future where authoritarian governments will brand users who practice "non-prescribed use" as enemies of the state. And when we have a government whose leader openly gifts deep, direct access to federal power to unethical tech leaders who've funded elections (ex: Thiel), that branding would be a powerful perk to have access to (even if indirectly).
You're not going to avoid any state surveillance if the state is really interested in you specifically.
But you can still help prevent abuses of mass surveillance without probable cause by making such surveillance as expensive and difficult as possible for the state
> This makes the privacy purists angry, but in my opinion it's the reasonable default for the average computer user.
Absolutely not. If my laptop tells me that it is encrypted by default, I don't like that the default is to also hold a copy of the keys in case big brother wants them.
Call me a "privacy purist" all you want, but it shouldn't be normal to expect the government to have access to a key to your house.
I think this is a fair position and believe you're making it in good faith, but I can't help but disagree.
I think the reasonable default here would be to not upload to MS servers without explicit consent about what that means in practice. I suspect that if you actually asked the average person whether they're okay with MS having access to all of the data on their device (including browser history, emails, photos), they'd probably say no if given the choice.
Maybe I'm wrong though... I admit I have a bad theory of mind when it comes to this stuff because I struggle to understand why people don't value privacy more.
> Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works.
Companies know that putting themselves in a position where they can betray their users, means they will be forced to do so. Famously demonstrated when Apple had to ban the Hong Kong protest app [1]. Yet they continue to do it, don't inform their users, and in the rare occasion that they offer an alternative, it is made unclear and complicated and easy to get wrong [2].
> Journalists love the "Microsoft gave" framing because it makes Microsoft sound like they're handing these out because they like the cops, but that's not how it works. If your company has data that the police want and they can get a warrant, you have no choice but to give it to them.
These two statements are in no way mutually exclusive. Microsoft is gobbling up your supposedly private encryption keys because they love cops and want an excuse to give your supposedly private data to cops.
Microsoft could simply not collect your keys and then would have no reason or excuse to hand them to cops.
Similar case with Apple devices. They default to backing up to Apple servers where they are unencrypted. So they can provide data to police if requested. But for anyone concerned about privacy they can use Advanced Data Protection which encrypts all their data and prevents Apple from reading it or recovering it.
Definitely agree that choices like these are the most sane for the default user experience and that having these advanced options for power users to do with as they want is a fair compromise. Wish more people were open to designing software for the average person and compromising on a middle ground that benefits both kinds of users.
Yeah guys, if it's encrypted by default, it's not a violation of user security or privacy expectations to have a set of master keys that you hold onto and give to third parties to decrypt user devices. I mean it was just encrypted by default... by default...
Microsoft could have done key backups to secure enclaves that will only return them to a user able to produce valid signatures using a backup code or otherwise they hold. Hell they were the ones that normalized remote attestation.
But Microsoft chose to keep them plain text, and thus they are, and will continue to be abused.
We must not victim-blame. This is absolutely corruption on Microsoft's part.
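A minimal sketch of what the parent comment is gesturing at: an escrow service that stores an opaque blob at enrollment and releases it only to a client that can sign a fresh challenge. All names here, and the Ed25519 choice, are hypothetical, purely to show the shape of the design; this is not anything Microsoft ships.

```python
# Hypothetical enclave-style escrow: the service releases the backed-up
# blob only to someone who proves possession of the enrolled private
# key (which could itself be derived from a backup code the user holds).
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey, Ed25519PublicKey)
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

class EscrowService:
    def __init__(self):
        self._store = {}  # user_id -> (raw public key, escrowed blob)

    def enroll(self, user_id: str, public_key: bytes, blob: bytes):
        self._store[user_id] = (public_key, blob)

    def challenge(self) -> bytes:
        return os.urandom(32)  # fresh nonce, prevents replaying old signatures

    def recover(self, user_id: str, challenge: bytes, signature: bytes) -> bytes:
        public_key, blob = self._store[user_id]
        verifier = Ed25519PublicKey.from_public_bytes(public_key)
        verifier.verify(signature, challenge)  # raises InvalidSignature on forgery
        return blob

# Usage: only the holder of the private key can retrieve the backup.
user_key = Ed25519PrivateKey.generate()
raw_pub = user_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
service = EscrowService()
service.enroll("alice", raw_pub, b"escrowed-recovery-key-blob")
nonce = service.challenge()
assert service.recover("alice", nonce, user_key.sign(nonce)) == b"escrowed-recovery-key-blob"
```

Of course, the service still holds the blob, so this only helps if the blob itself is ciphertext; combined with client-side wrapping (as in the earlier sketch), a warrant yields nothing usable.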
can they compel testimony? keys, passcodes and the like are usually considered testimony. did they try? the usual story here is that they don't have to, that the big corporations will turn over any info they have on request because they can and the government makes a better friend than a single user. the article mentions 20 "requests" per year on average but doesn't say anything about the government using force.
I agree with your conclusion though: data you share with anyone is data you've shared with everyone and that includes your encryption keys. if that matters to you, then you need to take active steps to ensure your own security because compelled or not, the cloud providers aren't here to help keep you safe.
"They have no choice" because they're "just doing their job" and "following the law."
Which are both choices. Microsoft can for sure choose to block the government and so can individual workers. Let's not continue the fascism-enabling narratives of "no choice."
Tl;dr - "Basically, you’re either dealing with Mossad or not-Mossad. If your adversary is not-Mossad, then you’ll probably be fine if you pick a good password and don’t respond to emails from ChEaPestPAiNPi11s@ virus-basket.biz.ru. If your adversary is the Mossad, YOU’RE GONNA DIE AND THERE’S NOTHING THAT YOU CAN DO ABOUT IT" (Mickens, 2014)
Microsoft shouldn't be uploading keys, but nor should they be turning BitLocker on without a proper key backup. Therefore it should be left as an optional feature.
The quality of journalism you consume is highly dependent on the sources you choose. Some outlets still highly value journalistic integrity. I prefer to read those. Not that any of them are perfect. But it makes a huge difference and they typically provide a much more nuanced view. The Atlantic and the Wall Street Journal are good examples of this in my opinion.
>The defaults will also upload the BitLocker key to a Microsoft Account if available.
>This is why the FBI can compel Microsoft to provide the keys.
>in my opinion it's the reasonable default
I really can't imagine what kind of person would say that with a straight face. Hanlon's razor be damned, I have to ask: are you a Microsoft employee or investor?
> Back in the day hackernews had some fire and resistance.
Most of the comments are fire and resistance, but they commonly take ragebait and run with the assumptions built into clickbait headlines.
> Too many tech workers decided to rollover for the government and that's why we are in this mess now.
I take it you've never worked at a company when law enforcement comes knocking for data?
The internet tough guy fantasy where you boldly refuse to provide the data doesn't last very long when you realize that it just means you're going to be crushed by the law and they're getting the data anyway.
> I take it you've never worked at a company when law enforcement comes knocking for data?
The solution to that is to not have the data in the first place. You can't avoid the warrants for data if you collect it, so the next best thing is to not collect it in the first place.
The technology exists to trivially encrypt your data if you want to. That's not a product most people want, because the vast majority of people (1) will forget their password and don't want to lose their data, and (2) aren't particularly worried about the feds barging in and taking their laptop during a criminal investigation.
That's not what the idealists want, but that's the way the market works. When the state has a warrant, and you've got a backdoor, you're going to need to give the state the keys to the backdoor.
There are some errors in what you write, and despite that, it is not clear to me what the supposed ‘realization’ would be.
1. The famous 2016 San Bernardino case predates Advanced Data Protection technology of iCloud backups. It was never about encryption keys, it was about signing a ‘bad’ iOS update.
2. Details are limited, but it involved a third-party exploit to gain access to the device, not to break the encryption (directly). These are different things and should both be addressed for security, but separately.
Evidently, after this case ended, Apple continued its efforts. It rolled out protection of backups from Apple itself, and the requirement of successful user authentication before installing iOS updates (which also protects against Apple or stolen signing keys).
Plenty of companies would do that if they could. The problem is it has become illegal for them to do that now. KYC/AML laws form the financial arm of warrantless global mass surveillance.
Where I live, government passed a similar law to the UK's online identification law not too long ago. It creates obligations for operating system vendors to provide secure identity verification mechanisms. Can't just ask the user if they're over 18 and believe the answer.
The goal is of course to censor social media platforms by "regulating" them under the guise of protecting children. In practice the law is meant for and will probably impact the mobile platforms, but if interpreted literally it essentially makes free computers illegal. The implication is that only corporation owned computers will be allowed to participate in computer networks because only they are "secure enough". People with their own Linux systems need not apply because if you own your machine you can easily bypass these idiotic verifications.
In Brazil, where I live, it's law 15.211/2025. It makes it so that the tech industry must verify everyone's identity in order to proactively ban children from the harmful activities. It explicitly mentions "terminal operating systems" when defining which softwares the law is supposed to regulate.
If you design it so you don't have access to the data, what can they do? I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
If you design it so you don't have access to the data, how do you make money?
Microsoft (and every other corporation) wants your data. They don't want to be a responsible custodian of your data, they want to sell it and use it for advertising and maintaining good relationships with governments around the world.
> If you design it so you don't have access to the data, how do you make money?
The same way companies used to make money, before they started bulk harvesting of data and forcing ads into products that we're _already_ _paying_ _for_?
I wish people would have integrity instead of squeezing out every little bit of profit from us they can.
People arguably cannot have integrity unless all the other companies they compete with also have integrity. The answer is legislation. We have no reason to allow our government to use "private" companies to do what it cannot do itself and then turn the results over to government agencies. Especially when the incompetence is willful.
The same can be said of using “allies” to mutually snoop on citizens then turning over data.
> I'm sure there's some cryptographic way to avoid Microsoft having direct access to the keys here.
FTA (3rd paragraph): don't default upload the keys to MSFT.
>If you design it so you don't have access to the data, what can they do?
You don't have access to your own data? If not, they can compel you to testify about who or what the next step to accessing the data is, and then they chase that.
It doesn't sound like it tells you now that it's the default, but I'll see what it says next time. If they make the key-sharing clear and make it easy to disable, then it's fine.
> Too many tech workers decided to rollover for the government and that's why we are in this mess now.
It has nothing to do with the state. It has to do with getting the RSUs to pay the down payment for a house in a HCOL area in order to maybe have children before 40, and making the KPIs so you don't get stack-ranked into the bottom 30% and fired at big tech, or grinding 996 to make your investors richer and yourself rich-ish in the process if you're lucky enough to exit in the upper decile with your idea. This doesn't include the contingent of people who fundamentally believe in the state, too.
Most people are activists only to the point of where it begins to impede on their comfort.
The engineers who developed this built it to a spec that Microsoft demanded, one that allows Microsoft to get into the system at any time. There was nothing lazy about it. This would be easily found by anyone who has the impetus to encrypt their drive. Don't put things on your work laptop that you don't want Dom down in IT or Phil the police forensics dick reading.
it's the natural result of this site catering not just to tech nerds but to ones chasing venture capital money. it's an industry that has never seen a dark pattern it didn't like. we have gone from "don't be evil" to "be evil if it makes the stonks go up"
I don’t see that at all. Instead, I think tech workers, including the engineers and the product managers, are correctly prioritizing user convenience over resistance to government abuse. It’s honestly the right trade off to make. Most users worry about casual criminals, not governments. Say a criminal snatching your laptop and accessing your files that way. If you worry about governments you should already know what to do.
Look around you. At least in my company half the programmers are H-1B Indians. They're not going to resist anybody with the risk of getting deported back to India.
And too many tech workers decided to roll over for the big companies too, accepting and advocating whatever they do. Even when it's dubious, they can find a way to defend the big names, because they are big names: they know the way, they became big!
I agree with you, but also think this is only true because we as an industry have been so completely corrupted by money at this point.
In the 90s and 00s people overwhelmingly built stuff in tech because they cared about what they were building. The money wasn't bad, but no one started coding for the money. And that mindset was so obvious when you looked at the products and cultures of companies like Google and Microsoft.
Today however people largely come into this industry and stay in it for the money. And increasingly tech products are reflecting the attitudes of those people.
> Back in the day hackernews had some fire and resistance
Hackernews is a public forum, and the people here change constantly. "Back in the day" there were mostly posts about LISP and startup equity. It's obviously not the same people here now.
> Too many tech workers decided to rollover for the government
Again, not the same group of people. In the 2000s "tech workers" might have mostly been Californians. Now they're mostly in India. Differing perspectives on government, to be sure.
> lazy engineers build lazy key escrow
Hey you should know this one, because it's something that HAS stayed constant since "back in the day": The engineers have absolutely no say in this whatsoever.
This is such a lazy take and ignores that this is the only system that has the property of not losing data when users forget their passwords and lose (or likely never write down) their recovery key.
That's it. That's the whole thing. Whatever "secure system" you build will not have this property and users will lose their data, be mad at you, and eventually you'll have to turn it off by default leaving everyone's data in plaintext. It's a compromise that improves security for people who previously left their disk unencrypted. It changes nothing for people who previously did their own key management.
You won't be able to turn the first group into the second group. That's HN's "Average Familiarity" fallacy. The fact that basically every 2FA system has a means of recovering your account by removing it should tell you that even technical people are shit at key management.
Yep... I've seen exactly this happen. People losing data/access by their own fault and yet being extremely mad at the OS developer or the company they have an account with. And, no, it does not matter if you tell them 100 times that they are responsible for not losing their own keys/passwords, they will still be furious that you set up your system in (from their perspective) such a shitty way that it's even possible for a permanent lockout to happen.
Saying "of course" doesn't mean we agree with it or fail to try to resist it. It's simply not surprising that this happened.
When you get high up in an org, choosing Microsoft is the equivalent of the old "nobody ever got fired for buying IBM". You are off-loading responsibility. If you ever get high up at a fortune 500 company, good luck trying to get off of behemoths like Microsoft.
It's why tech loves young engineers who just do what they're told, and old engineers only as long as they can't say no. Once you dig into the system and see how all the pieces fit together, you can't ethically or morally continue to participate any longer. Learned that the hard way. I'm in the middle of an attempted midlife career change because of it, to maybe free myself to write software that needs to be written instead of having to keep a retained lawyer on hand to wrangle employment contract clauses so my work stays mine.
> Too many tech workers decided to rollover for the government and that's why we are in this mess now.
It isn't really about the government. It's about a bunch of people trying to convince you that the locked-down proprietary closed source corporate crap that they use isn't in and of itself a security risk, no matter what the quality of the code that you've never seen is. Apple, Microsoft, Google etc. aren't your friends; no matter how brand loyal you are, they'll never care whether you're alive or dead.
FOSS isn't your friend either, but they're not asking you to trust them. Any exposure to these world spanning juggernaut military and intelligence contractor companies is a security hole. It's insane that people (thinking of Europeans now) get fired up to switch from this stuff because Trump but not because of course you should. Instead they're busy calling being suspicious of Microsoft old and hatred of Apple's customer corral stuck up and the desire to own your own machine fanatical and judgemental. Have you ever considered that you've been programmed to say and encourage dumb stuff that is completely against your own interests and supports the interests of the people who sell things to you?
You're convinced by the argument that people dumber than you have to be protected from their own machines (by corporations who have no interest in or obligation to protect them) - have you ever thought that people are saying the same thing about you? That you have to be protected from writing things you shouldn't write or talking to people you shouldn't be talking to? And the world isn't a meritocracy: the people on the top are inbred creeps. You've given up your freedom to dummies with marketing departments.
I used to be a principled freedom fighter. But others defected (thinking mostly about Apple users...). I promoted open source software, even dealing with the pains.
So now I just use whatever I want. Someone else can be a tech moralist.
The median user's threat model doesn't include the government, but does include data loss, forgetting the password, or a thief stealing your laptop. Microsoft struck the right balance.
I'm glad the knee-jerk absolutists are marginal, for one. A world run by you people would be much worse for anyone who isn't you.
A world run by "those" people would be a less abusive and exploitative world; our current world is one based on suffering if you aren't extremely wealthy. I think I know which world I would rather join.
Today the median user's threat model absolutely includes the government! They are snatching people up left and right, including their electronics.
I don’t get how people like you trust the corporation or the government that much. If we were all more cognizant of security and privacy, it would be much harder for large orgs to break our society the way they are doing today.
The median user would be better off in a society where computers are not needed for daily life. The median user doesn't understand computers. In their life, computers only manifest as a tool of control imposed by the people who understand computers over those who don't.
This is one such example.
This sort of utilitarian nitpicking over the convenience of a "median" user is like maximizing the happiness of a cow on a factory farm. The cow would be better off if it did not exist at all. It is a matter of freedom and dignity.
My Linux drives are all encrypted, and one of the wonderful features of this is that there is no entity or force on this planet that can decrypt them.
What happens if I forget my keys? Same thing that happens if my computer gets struck by a meteor. New drive, new key, restore contents from backups.
It's simple, secure, set-and-forget, and absolutely nobody but me and your favored deity have any idea what's on my drives. Microsoft and the USGov don't have any business having access to my files, and it's computationally infeasible for them to gain access within the next few decades.
Don't use Windows. Use a secure operating system. Windows is not security for you, it's security for a hostile authoritarian government.
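For the curious, setting this up takes only a few commands. Here's a minimal sketch driving cryptsetup from Python (the /dev/sdX path is a placeholder, and luksFormat is destructive, so only aim it at a disposable device):

    # minimal sketch: create, open, and format a LUKS2 volume.
    # DESTRUCTIVE -- /dev/sdX is a placeholder; use a throwaway device.
    import subprocess

    dev = "/dev/sdX"
    mapped = "secret"

    # luksFormat writes the LUKS2 header and prompts for a passphrase
    subprocess.run(["cryptsetup", "luksFormat", "--type", "luks2", dev], check=True)
    # open prompts for the passphrase and exposes /dev/mapper/secret
    subprocess.run(["cryptsetup", "open", dev, mapped], check=True)
    # put an ordinary filesystem on the mapped plaintext device
    subprocess.run(["mkfs.ext4", f"/dev/mapper/{mapped}"], check=True)

The passphrase never leaves the machine; there is no escrow step unless you add one yourself.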
It's a good start, but FDE alone is still fairly easy to compromise in many cases. If you ever type the password under a camera, it may be leaked. If the device ever leaves your possession and you don't have secure boot, your bootloader can be trivially altered to leak the password. Then there are keyloggers. And cold boot attacks can often be done if your system is running.
Yeah, if the drive can be decrypted by an external party that you didn't give permission to, I'm not sure how it's really "encryption", other than burning cycles when doing writes.
> there is no entity or force on this planet that can decrypt them.
At this point I think all of the modern, widely used symmetric cryptography that humans have invented will never be broken in practice, even by another more technologically advanced civilization.
On the asymmetric side, it's a different story. It seems like we were in a huge rush to standardize because we really needed to start PQ encrypting data in transit. All the lattice stuff still seems very green to me. I put P(catastrophic attack) at about 10% over the next decade.
You should also have several large random blobs with incriminating filenames on your hard drive. Attackers won't know which one is encrypted and which one is random. If you like, you can have an encrypted blob of decoy data next to your random blobs and your actually incriminating encrypted blob, and if you're duressed, you reveal that one as the real one.
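If you want to try this, a rough sketch is below (the filenames are made up for illustration). One caveat: this only works with header-less formats like VeraCrypt containers, since a LUKS volume announces itself with a plaintext header:

    # sketch: scatter random decoy blobs among a real (header-less) container.
    import os

    SIZE = 256 * 1024 * 1024  # 256 MiB per blob; scale up as you like
    for name in ("ledger_2019.bin", "offshore.bin", "sources.bin"):
        with open(name, "wb") as f:
            left = SIZE
            while left:
                chunk = min(left, 1 << 20)
                f.write(os.urandom(chunk))  # CSPRNG output looks like ciphertext
                left -= chunk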
I've been trying to get my parents to move, but until Microsoft Office desktop is able to be run natively on there my parents won't entertain the subject.
I've tried to get them to use the web version of office, I've tried to get them to use OnlyOffice and LibreOffice, I've even tried showing them LaTeX as a last ditch effort, but no, if it isn't true Microsoft Branded Office 2024, the topic isn't even worth discussing [1].
I'm sure there are technical reasons why Wine can't run Office 2024, and I am certainly not trying to criticize the wine developers at all, but until I can show Wine running full-fat MS Office, my parents will always "miss" Windows.
To be clear, I hate MS Office. I do not miss it on Linux. I'm pretty sure my parents could get by just fine with LibreOffice or OnlyOffice or Google Docs, but they won't hear it.
I've also tried to get them to use macOS, since that does have a full-fat MS Office, I've even offered to buy them Macbooks so they can't claim it's "too expensive", and they still won't hear it. I love my parents but they can be stubborn.
[1] Before you accuse me of pushing for "developer UI", LaTeX was not something I led with. I tried the more "normy-friendly" options first.
Your parents have a point. I've been switching most of my family's PCs to linux in the past few years and I miss Office. It is as easy to use as OnlyOffice and as powerful as LibreOffice for my tasks. There exists no equivalent on linux.
I recently helped my GF by proofreading something she wrote, which is a primarily Hebrew (RTL) Word document with English terms like units, numbers, and unpronounceable chemical names sprinkled in.
If I had a dollar for every time MS Word failed to correctly handle the BIDI mix and put things in the wrong order, despite me repeatedly trying different ways to fix it, I'd be richer than Microsoft.
On the contrary, Google Docs, LibreOffice, and pretty much every text box outside of MS Office can effortlessly handle BIDI mixing, all thanks to the Unicode Bidirectional Algorithm [1] being widely implemented and standardized.
I use macOS most of the time, but switch to a Windows VM for Excel. Without the same keyboard shortcuts, the macOS version ends up having a fraction of the power available to experienced users of the Windows version. For people who use Excel extensively, LibreOffice or Google Sheets would have to offer some remarkable new killer features to make it worth the switch. I don’t think feature parity alone would make the benefits of Linux outweigh the significant transition costs.
They are like Vim. “Alt,letter,letter,arrow,letter,letter,arrow,enter”, etc. Rather than a single combination of keys, it is a series of key presses.
I agree that it might be trivial to set up for spreadsheets, and it would be really useful in other spreadsheet programs and many other applications. I suppose a hurdle is how context-sensitive the commands are depending on the cell or range of cells activated, and their contents and data type.
I mean, I think not having Copilot being shoved at you and not having advertisements pushed on you and having recovery tools that actually work and basically a lifetime of free updates would be a pretty big value add for Linux over Windows, and those go beyond feature parity.
Is your last name Segurakreischer?
Have them try - leave the Windows computer online and accessible, give your parents a linux box and have them use it exclusively unless they absolutely 100% need to get back on the Windows machine for some reason, and talk with you about it. Set up a NAS with an external HD and a shared folder on both the windows and linux box, so if they actually do need to go back to Windows, they aren't leaving anything stuck on the Linux box.
That's a 100% easy peasy safe mode, the worst they're likely to encounter is a brief 2 minute call with you, and in the worst case scenario, they get to go back to Windows without having to be scared of losing anything.
Afraid I don't get the reference if this is a joke, but no that is not my last name.
I've offered similar solutions to this; a VM that they can RDP into, or just a VM running locally with Winboat or Winapps so they could work with the apps they need to, but they won't entertain the idea.
Honestly I kind of think they're adding conditions just so I stop bothering them about it. I think they very much do not want to change operating systems, and they know that just saying so won't be a valid enough excuse to get me to shut up about it.
Before people give me shit over trying to force my dogma on them, I should point out that when their computers break (e.g. Windows Update decides to brick their computer), I am the one that is expected to fix them. I don't think it's unreasonable that if I'm expected to do the repairs on the computer that I get a say in what's installed on them.
Just remember, never use or recommend Debian-family distros (Ubuntu/Mint) or you will be back on Windows. Do not fall for the marketing term "Stable", which in practice means outdated and shipping bugs that have already been fixed upstream.
Fedora is my recommendation. I remind people Fedora is not Arch. Fedora is a consumer grade OS that is so good, I don't lump it in with the word Linux.
I’ve tried multiple versions when trying to move away from windows, but was always stuck with random inconsistencies everywhere.
Eventually I had to choose the larger evil and chose a Mac, after paying for a week of lost productivity installing, setting up, fucking up, wiping, and reinstalling random Linux distros.
Fedora is good and fairly stable, but it has bugged on me a few times.
In the past 3 years:
- mouse/cursor issues due to some kernel upgrade I think, as Fedora stays close to upstream
- unresponsive computer due to a bug in the AMD graphics driver
Both were easy to fix (kernel cmdline change or just kept updating my computer), and I absolutely recommend Fedora. That's what I'd use if I had servers. But, you'll probably have to debug _some_ issues if you use something less-used like AMD.
Once you've got a bit of savvy, do Arch. But if you're looking for "good" and "just works" and you don't want to tinker and/or occasionally scream at your computer in inchoate fury, Fedora is the way.
You can build your ideal fantasy setup piecewise, and I definitely recommend getting there, but Fedora is nice, and clean, and has plenty of "just works", and 99.999% of the problems you might run into, someone else has, too, and they wrote a treatise and tutorial on how to fix it and why it happened.
> Microsoft told Forbes that the company sometimes provides BitLocker recovery keys to authorities, having received an average of 20 such requests per year.
At least they are honest about it, but a good reason to switch over to linux. Particularly if you travel.
If microsoft is giving these keys out to the US government, they are almost certainly giving them to all other governments that request them.
That only strengthens the parent point. Switch to an OS where this requirement doesn't come into play if you're worried about any governments having a backdoor into your own machine.
Considering Windows's history with user consent I would be worried about the keys eventually being uploaded without asking the user and without linking online accounts.
Probably not now but not something unimaginable in some future.
However, since Windows can still run on user-controlled hardware (non-secure boot or VMs), I guess this kind of behavior could be checked for by intercepting communications before TLS encryption.
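For example, a hypothetical mitmproxy addon could watch outbound traffic for your own (test) recovery key. This is only a sketch; it assumes a Windows VM that trusts the proxy's CA, and endpoints using certificate pinning would evade it:

    # watch_key.py -- hypothetical mitmproxy addon; run with:
    #   mitmdump -s watch_key.py
    # after routing the VM through the proxy and installing its CA cert.
    from mitmproxy import http

    # placeholder: your own 48-digit test recovery key, grouped as BitLocker prints it
    RECOVERY_KEY = "111111-222222-333333-444444-555555-666666-777777-888888"

    def request(flow: http.HTTPFlow) -> None:
        body = (flow.request.get_text(strict=False) or "").replace("-", "")
        if RECOVERY_KEY.replace("-", "") in body:
            print(f"[!] recovery key seen in request to {flow.request.pretty_url}")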
People know the system well enough to write FOSS implementations of it; I think they would have noticed and sounded the alarm if there were a possible master key.
I don't think anybody is interested in reverse-engineering a closed-source OS to check that it works as documented; it's easier to just use Linux, which has open-source code.
If you sync your Linux machine's key to the cloud, police could subpoena it too. The solution is not to switch to Linux, but to stop storing the key in plain text in the cloud.
Thanks for the link, interesting article. The UK is among the worst in this regard.
Regarding the article's Apple example:
> The FBI eventually found a third party to break into the phone, but the tension between privacy and security remains unresolved.
This is actually quite resolved.
- Tech companies in the US are free to write secure encryption technologies without backdoors.
- Government is free to try to break it when they have valid legal authority.
- Tech companies are obligated to turn over information in their possession when given a legal warrant signed by a judge based on probable cause that a crime has occurred.
- Tech companies are not required to help hack into systems on the government's behalf.
As far as I'm concerned, in the US things are perfectly resolved, and quite well I think. It's the government and fear-mongers who constantly try to "unresolve" things.
I will never understand this from software engineers/tech people in general. That demographic knows how technology works, and are equipped to see exactly where and how Microsoft is taking advantage of them, and how the relationship is all take and zero give from their end. These people are also in the strongest position to switch to Linux.
The only explanation that makes sense to me is that there's an element of irrationality to it. Apple has a well known cult, but Microsoft might have one that's more subtle? Or maybe it's a reverse thing where they hate Linux for some equally irrational reasons? That one is harder to understand because Linux is just a kernel, not a corporation with a specific identity or spokesperson (except maybe Torvalds, but afaik he's well-regarded by everyone)
Microsoft is known for regularly altering the deal. Just because you configure the OS to not upload keys today, does not mean that setting will be respected in the future.
Because that gives you a lot more control over your computer than just solving this particular issue. If you care about privacy it's definitely a good idea.
you've baked in an unfounded assumption that bitlocker is even initially enabled intentionally by someone who knows that's a choice they can make:
> Here's what happens on your Dell computer:
> BitLocker turns on automatically when you first set up Windows 10 or Windows 11
> It works quietly in the background, you won't notice it's there
> Your computer creates a special recovery key (like a backup password) that's saved to your Microsoft account
> You might be reading this article because:
> Your computer is asking for a BitLocker recovery key
...such as after your laptop resets its tpm randomly which is often the first time many people learn their disk is encrypted and that there's a corresponding recovery key in their microsoft account for the data they are now unexpectedly locked out of.
Based on the comments in the thread, I sense I will be in the minority, but for most consumers this is a reasonable default. Broadly speaking, the threat model most users are concerned with doesn't account for their government. The previous default is no encryption at rest, which doesn't protect from the most common threats, like theft or tampering. With BitLocker on, a new risk for users is created: loss of access to their data because they don't have their recovery key. You are never forced to keep your recovery keys in Microsoft's servers and it's not a default for corporate users.
It's certainly a reasonable default. People lose or have their laptops stolen much more often than they get targeted by their governments.
Though that doesn't mean Microsoft couldn't implement a way of storing these keys so that they can't be accessed by Microsoft. Still better than nothing though.
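For instance, a passphrase-wrapped escrow scheme would let the server store only an opaque blob. A minimal sketch follows, emphatically not Microsoft's actual design (it uses the third-party cryptography package). The obvious catch, as other comments point out, is that a forgotten wrapping passphrase puts users right back at unrecoverable data:

    # minimal sketch, NOT Microsoft's design: wrap the recovery key client-side
    # with a key derived from a user passphrase, so the escrow server only ever
    # stores a blob it cannot decrypt. Needs: pip install cryptography
    import os, hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def wrap_for_escrow(recovery_key: bytes, passphrase: str) -> bytes:
        salt = os.urandom(16)
        kek = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
        nonce = os.urandom(12)
        ct = AESGCM(kek).encrypt(nonce, recovery_key, None)
        return salt + nonce + ct  # safe to upload; useless without the passphrase

    def unwrap_from_escrow(blob: bytes, passphrase: str) -> bytes:
        salt, nonce, ct = blob[:16], blob[16:28], blob[28:]
        kek = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)
        return AESGCM(kek).decrypt(nonce, ct, None)  # raises if passphrase is wrong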
I'll always remember - when I was first learning about it, one of the interesting counter-arguments to ignoring privacy was "what if the Nazis come back, would you want them to have your data?". I suppose there's some debate these days, but hostile governments seem a lot closer than they were 10-15 years ago.
Will this make people care? Probably not, but you never know.
Even in the best of times. Why widen your attack surface unnecessarily? Do you tell people your passwords and PINs at parties?
What governments and corporations (and plenty of bad actors in the FOSS world) have done is make this the default; made it easy to mindlessly hand people your privacy without even knowing. Opt-out, if you know the setting exists, and can find it.
I honestly love how HN is missing the forest for the trees, here, in the sense that ya’ll are upset Microsoft gave keys over for BitLocker to the feds but seemingly forget that Microsoft has been doing this in various forms since BitLocker released. Hell, they’ve given alphabet agencies tools that just pop the decryption in the field before, for intelligence work.
I trust BitLocker and Apple’s encryption to protect my stuff against snooping thieves, but I have never, ever assumed for a moment that it’d protect me against a nation-state, and neither should you. All the back-and-forth you see in the media is just what’s public drama, and a thin veil of what’s actually going on behind the scenes.
If there’s stuff you don’t want a nation state to see, it better be offline, on an OSS OS, encrypted with thoroughly audited and properly configured security tooling. Even then, you’re more likely to end up in jail for refusing to decrypt it [1][2].
Perhaps next time, an agent will copy the data, wipe the drive, and say they couldn't decrypt it. 10 years ago agents were charged for diverting a suspect's Bitcoin, I feel like the current leadership will demand a cut.
This is my biggest fear wrt gov't search-and-seizure. I know the police won't be able to get at my juicy encrypted bits, but I also know they're vindictive bastards who'll be held to no accountability. Of course they'll wipe my drives just to get revenge for me "winning" by having blocked their access.
This is no different to Apple placing the encryption key for Filevault as plaintext on disk when it is turned off. Both companies make it easy for you to recover data in event of a catastrophe.
I see a lot of comments recommending TrueCrypt/VeraCrypt here, which is fine, but did you know there is something even more interesting? ;)
Shufflecake ( https://shufflecake.net/ ) is a "spiritual successor" to TrueCrypt/VeraCrypt but vastly improved: works at the block device level, supports any filesystem of choice, can manage many nested layers of secrecy concurrently in read/write, comes with a formal proof of security, and is blazing fast (so much so, in fact, that it exceeds the performance of LUKS/dm-crypt/VeraCrypt in many scenarios, including SSD use).
Disclaimer: it is still a proof of concept, only runs on Linux, has no security audit yet. But there is a prototype for the "Holy Grail" of plausible deniability on the near future roadmap: a fully hidden Linux OS (boots a different Linux distro or Qubes container set depending on the password inserted at boot). Stay tuned!
I think most people don't understand that 99% of people don't know what data encryption is and definitely don't care about it. If it weren't for Bitlocker, their laptops wouldn't be encrypted at all! And of course if your software (Windows) encrypts by default but you don't want to bother the average user with the details (because they don't know anything about this or care about it) you will need to store the key in case they need it.
To everyone saying 'time to use Linux!'; recognize that if these people were using Linux, their laptops wouldn't be encrypted at all!
> If it weren't for Bitlocker, their laptops wouldn't be encrypted at all!
And because of Bitlocker, their encryption is worth nothing in the end.
> if these people were using Linux, their laptops wouldn't be encrypted
Maybe, maybe not. Ubuntu and Fedora both have FDE options in the installer. That's objectively more honest and secure than forcing a flawed default in my opinion.
> And because of Bitlocker, their encryption is worth nothing in the end.
No, it's worth exactly what it's meant for: in case your laptop gets stolen!
> flawed default
Look, in terms of flaws I would argue 'the government can for legal reasons request the key to decrypt my laptop' is pretty low down there. Again, we're dealing with the general populace here; if it's a choice between them getting locked out of their computer completely vs the government being able to decrypt their laptop this is clearly the better option. Those who actually care about privacy will setup FDE themselves, and everyone else gets safety in case their laptop gets stolen.
> No, it's worth exactly what it's meant for: in case your laptop gets stolen!
If my laptop gets stolen and it's worth something, the thief will wait until they can crack the management keys. We see this with corporate-locked laptops and Macbooks, iPhones and Androids, and other encrypted curiosities that get cracked at a lab in Tel Aviv for pennies on the dollar.
> Those who actually care about privacy will setup FDE themselves
This line is equivalent to forfeiting your position so I don't even know what to argue over anymore. I do care about privacy and I have no idea who you're arguing in-favor of.
> This is by far one of the best advertisements for LUKS/VeraCrypt I've ever seen.
LUKS isn't all rainbows and butterflies either [https://news.ycombinator.com/item?id=46708174]. This vulnerability has been known for years, and despite this, nothing has been done to address it.
Furthermore, if you believe that Microsoft products are inherently compromised and backdoored, running VeraCrypt instead of BitLocker on Windows likely won’t significantly improve your security. Implementing a VeraCrypt backdoor would be trivial for Microsoft.
Sadly VeraCrypt is not optimized for SSDs and has a massive performance impact compared to Bitlocker for full disk encryption because the SSD doesn't know what space is used/free with VeraCrypt.
Forgive me this shameless ad :) with the latest performance updates, Shufflecake ( https://shufflecake.net/ ) is blazing fast (so much so, in fact, that it exceeds the performance of LUKS/dm-crypt/VeraCrypt in many scenarios, including SSD use).
VeraCrypt can be set to pass through TRIM. It just makes it really obvious which sectors are unused within your encrypted partition (they read back as 00 bytes)
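To see how much that reveals, here's a small sketch that counts zeroed 4 KiB chunks on a raw device (the device path is a placeholder; reading raw devices needs root):

    # sketch: count all-zero 4 KiB chunks on a block device, to show roughly
    # what TRIM passthrough reveals about free space inside the partition.
    import sys

    CHUNK = 4096
    zeros = total = 0
    with open(sys.argv[1], "rb") as dev:  # e.g. python3 scan.py /dev/sdX1
        while chunk := dev.read(CHUNK):
            total += 1
            if not any(chunk):  # every byte is 0x00
                zeros += 1
    print(f"{zeros}/{total} chunks read back as zeroes (likely trimmed/free)")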
Oh I did not know of this option, thanks! However, I was wrong about the reason for the performance loss on high speed SSDs and the issue is actually related to how VeraCrypt handles IRPs: https://github.com/veracrypt/VeraCrypt/issues/136#issuecomme...
The performance loss can be substantial on modern NVMe drives, up to 20 times slower. But I was wrong about the reason for the performance loss: it's not TRIM but how VeraCrypt handles I/O operations. You can see some real numbers in this GitHub issue: https://github.com/veracrypt/VeraCrypt/issues/136
Remember when the original dev of TrueCrypt (the VeraCrypt predecessor) suddenly abandoned the project and wrote that people should use BitLocker instead? [1] [2]
We now know that BitLocker is not secure, and an intelligent open source dev saying that was probably knowingly not telling the truth.
The best explanation to me is that this was said under duress, because somebody wanted people to move away from the good TrueCrypt to something they could break.
alternatively, they knew truecrypt/veracrypt to be irreparably compromised, and while bitlocker may be backdoored in the same way, it is at least maintained
In Apple's case, starting with macOS Tahoe, Filevault saves your recovery key to your iCloud Keychain [0]. iCloud Keychain is end-to-end encrypted, and so Apple doesn't have access to the key.
As a US company, it's certainly true that given a court order Apple would have to provide these keys to law enforcement. That's why getting the architecture right is so important. Also check out iCloud Advanced Data Protection for similar protections over the rest of your iCloud data.
As of macOS Tahoe, the FileVault key you (optionally) escrow with Apple is stored in the iCloud Keychain, which is cryptographically secured by HSM-backed, rate-limited protections.
Unbreakable phones are coming. We’ll have to decide who controls the cockpit: The captain? Or the cabin?
The security in iOS is not designed to make you safer, in the same way that cockpit security doesn't protect economy class from rogue pilots or business-class terrorists. Apple made this decision years ago; it's right there in Slide 5 of the Snowden PRISM disclosure. Today, Tim stands tall next to POTUS. Any preconceived principle that Apple might have once clung to is forfeit next to their financial reliance on American protectionism: https://www.cnbc.com/2025/09/05/trump-threatens-trade-probe-...
Of course Apple offers a similar feature. I know lots of people here are going to argue you should never share the key with a third party, but if Apple and Microsoft didn't offer key escrow they would be inundated with requests from ordinary users to unlock computers they have lost the key for. The average user does not understand the security model and is rarely going to store a recovery key at all, let alone safely.
Apple will escrow the key to allow decryption of the drive with your iCloud account if you want, much like Microsoft will optionally escrow your BitLocker drive encryption key with the equivalent Microsoft account feature. If I recall correctly it's the default option for FileVault on a new Mac too.
If they say they don't, and they do, then that's fraud, and they could be held liable for any damages that result. And, if word got out that they were defrauding customers, that would result in serious reputational damage to Apple (who uses their security practices as an industry differentiator) and possibly a significant customer shift away from them. They don't want that.
The government would never prosecute a company for fraud where that fraud consists of cooperating with the government after promising to a suspected criminal that they wouldn't.
That's not the scenario I was thinking of. There are other possibilities here, like providing a decryption key (even if by accident) to a criminal who's stolen a business's laptop, or if a business had made contractual promises to their customers, based on Apple's promises to them. The actions would be private (civil) ones, not criminal fraud prosecution.
Besides, Apple's lawyers aren't stupid enough to forget to carve out a law-enforcement demand exception.
Cooperating with law enforcement cannot be fraud. Fraud is lying for illegal gain. I think it's legally OK to lie if the goal is to catch a criminal and help the government.
For example, in the 20th century, a European manufacturer of encryption machines (Crypto AG [1]) built a backdoor at the request of governments and never got punished -- instead it got generous payments.
None of these really match the scenario we're discussing here. Some are typical big company stuff, some are technical edge cases, but none are "Apple lies about a fundamental security practice consistently and with malice"
That link you provided is a "conspiracy theory," even by the author's own admission. That article is also outdated; OCSP is as dead as a doornail (no doubt in part because it could be used for surveillance) and they fixed the cleartext transmission of hardware identifiers.
Are you expecting perfection here? Or are you just being argumentative?
> That link you provided is a "conspiracy theory," even by the author's own admission.
"Conspiracy theory" is not the same as a crazy, crackhead theory. See: Endward Snowden.
Full quote from the article:
> Mind you, this is definitionally a conspiracy theory; please don’t let the connotations of that phrase bias you, but please feel free to read this (and everything else on the internet) as critically as you wish.
> and they fixed the cleartext transmission of hardware identifiers
Have you got any links for that?
> Are you expecting perfection here? Or are you just being argumentative?
I expect basic things people should expect from a company promoting themselves as respecting privacy. And I don't expect them to be much worse than GNU/Linux in that respect (but they definitely are).
It was noted at the bottom of the article as a follow up.
> I expect basic things people should expect from a company promoting themselves as respecting privacy. And I don't expect them to be much worse than GNU/Linux in that respect (but they definitely are).
The problem with the word “basic” is that it’s entirely subjective. What you consider “basic,” others consider advanced. Plus the floor has shifted over the years as threat actors have become more knowledgeable, threats more sophisticated, and technologies advanced.
Finally, the comparison to Linux doesn’t make a lot of sense. Apple provides a solution of integrated hardware, OS, and services. Linux has a much smaller scope; it’s just a kernel. If you don’t operate services, then by definition, you don’t have any transmitted data to protect. Nevertheless, if you consider the software packages that distros package alongside that kernel, I would encourage you to peruse the CVE databases to see just how many security notices have been filed against them and which remain open. It’s not all sunshine and roses over in Linux land, and never has been.
At the end of the day, it's all about how you weigh the evidence. If those examples are sufficient to tip the scales for you, that's your choice. However, Apple's overall trustworthiness--particularly when it comes to protecting people's sensitive data--remains high in the market. Even the examples you posted aren't especially pertinent to that (except for iCloud Keychain, where the complaint isn't whether Apple is securely storing it, but the fact that it got transmitted to them in the first place, and there exists some unresolved ambiguity about whether it is appropriately deleted on demand).
Terrible security... compared to what? Some ideal state that exists in your head, or a real-world benchmark? Do you expect them to ignore lawful orders from governments as well?
> Apple's solution is iCloud Keychain which is E2E encrypted, so would not be revealed with a court order.
Nope. For this threat model, E2E is a complete joke when both E's are controlled by the third party. Apple could be compelled by the government to insert code in the client to upload your decrypted data to another endpoint they control, and you'd never know.
This is a wildly unrealistic viewpoint. This would assume that you somehow know the language of the client you’re building and have total knowledge over the entire codebase and can easily spot any sort of security issues or backdoors, assuming you’re using software that you yourself didn’t make (and even then).
This also completely disregards the history of vulnerability incidents like XZ Utils, the infected NPM packages of the month, and even for example CVEs that have been found to exist in Linux (a project with thousands of people working on it) for over a decade.
You're conflating two orthogonal threat models here.
Threat model A: I want to be secure against a government agency in my country using the ordinary judicial process to order engineers employed in my country to make technical modifications to products I use in order to spy on me specifically. Predicated on the (untrue in my personal case) idea that my life will be endangered if the government obtains my data.
Threat model B: I want to be secure against all nation state actors in the world who might ever try to surreptitiously backdoor any open source project that has ever existed.
I'm talking about threat model A. You're describing threat model B, and I don't disagree with you that fighting that is more or less futile.
Many open source projects are controlled by people who do not live in the US and are not US citizens. Someone in the US is completely immune to threat model A when they use those open source projects and build them directly from the source.
We're talking about a hypothetical scenario where a state actor getting the information encrypted by the E2E encryption puts your life or freedom in danger.
If that's you, yes, you absolutely shouldn't trust US corporations, and you should absolutely be auditing the source code. I seriously doubt that's you though, and it's certainly not me.
The sub-title from the original forbes article (linked in the first paragraph of TFA):
> But companies like Apple and Meta set up their systems so such a privacy violation isn’t possible.
...is completely utterly false. The journalist swallowed the marketing whole.
Okay, so yes I grant your point that people where governments are the threat model should be auditing source code.
I also grant that many things are possible (where the journalist says "isn't possible").
However, what remains true is that Microsoft appears to store this data in a manner that can be retrieved through "simple" warrants and legal processes, compared to Apple where these encryption keys are stored in a manner that would require code changes to accomplish.
These are fundamentally different in a legal framework and while it doesn't make Apple the most perfect amazing company ever, it shames Microsoft for not putting in the technical work to accomplish these basic barriers to retrieving data.
> retrieved through "simple" warrants and legal processes
The fact it requires an additional engineering step is not an impediment. The courts could not care less about the implementation details.
> compared to Apple where these encryption keys are stored in a manner that would require code changes to accomplish.
That code already exists at apple: the automated CSAM reporting apple does subverts their icloud E2E encryption. I'm not saying they shouldn't be doing that, it's just proof they can and already do effectively bypass their own E2E encryption.
A pedant might say "well that code only runs on the device, so it doesn't really bypass E2E". What that misses is that the code running on the device is under the complete and sole control of apple, not the device's owner. That code can do anything apple cares to make it do (or is ordered to do) with the decrypted data, including exfiltrating it, and the owner will never know.
> The courts could not care less about the implementation details
That's not really true in practice by all public evidence
> the automated CSAM reporting apple does
Apple does not have a CSAM reporting feature that scans photo libraries, it never rolled out. They only have a feature that can blur sexual content in Messages and warn the reader before viewing.
We can argue all day about this, but yeah - I guess it's true that your phone is closed source so literally everything you do is "under the complete and sole control of Apple."
That just sends you back to the first point and we can never win an argument if we disagree about the level the government might compel a company to produce data.
It's also the "default" in Windows 11 to require the BitLocker recovery key every time you make a minor modification to the "BIOS", like changing the boot order.
I was going to say: "Well, Apple historically is an easy target of Pegasus", but that can only be used a few times before Apple figures out the exploit and fixes it. It's more expensive than just asking Apple.
But given PRISM, I'm sure Apple will just give it up.
> The hackers would still need physical access to the hard drives to use the stolen recovery keys.
Or remote access to the computer. Or access to an encrypted backup drive. Or remote access to a cloud backup of the drive. So no, physical access to the original hard drive is not necessarily a requirement to use the stolen recovery keys.
> ... The hackers would still need physical access to the hard drives to use the stolen recovery keys.
This is incorrect. A full disk image can easily be obtained remotely, then mounted wherever the hacker is located. The host machine will happily ask for the BitLocker key and make the data available.
This is a standard process for remote forensic image collection and can be accomplished surreptitiously with COTS tooling.
I consider myself pretty pro-privacy, but there is so much dragnet surveillance and legitimate breaches of the fourth amendment that I have a hard time getting up in arms over a company complying with a valid search warrant that is scoped to three hard drives (and which required law enforcement to have physical possession of the drives to begin with).
This is so much more reasonable than (for example) all the EU chat control efforts that would let law enforcement ctrl+f on any so-called private message in the EU.
A lot of them are not really legitimate though. There's a reason that 4th amendment needs a modern version to require a warrant for tapping of any sort for things people generally assume are private. Flock, palantir, etc need to all go bankrupt, starved of data to spy on. In an ideal world of course. Maybe someday we'll wake up from the nightmare.
No one should be surprised by this. If you are doing anything on a computer and don’t want it to be readily available to governments or law enforcement you have to use Linux
I fully agree that this is disconcerting from a privacy standpoint, and that it poses a danger when Microsoft gets hacked. As for it being user-hostile: I am pretty certain that thousands of users a year are delighted when something has gone wrong and they can recover their keys and data from the MS Cloud.
There should perhaps be a screen in a wizard:

    Do you want your data encrypted? (y/n)

    If yes:
        Do you want to be able to recover your data if something bad happens?
        (Otherwise it will be gone forever; you will never be able to access it again.)
        (y/n)
I think this is the right kind of place to ask: is it possible to encrypt the system disk after Linux has been installed, or do I have to reinstall Linux for that purpose?
You should carry around a Ventoy stick with Debian/XFCE and perhaps Mint on it, and a 16 TB external disk, and nag people in your local environment to let you back up their stuff and move them off MICROS~1 operating systems.
Tell them you work in IT and that you'll make their computer faster and more secure. Don't mention Linux by name.
I have opted out of all cloud services in my Windows installation; I use a pre-boot passphrase, too (it's required before the computer even boots). I feel like this is pretty safe
except MS could easily turn something on without you knowing and be uploading your files to their cloud. Yes, I believe they would stoop that low and even lower.
Companies like MS and Apple are telling their clients they offer a way to encrypt and secure their data but at best these claims are only half truths, mostly smoke and mirrors.
This is not OK. I don't want to get into legal parts of it, because I'm sure there's a fine print there that literally says it's smoke and mirrors, but it's despicable that these claims are made in the first place.
(2) On the real need for ironclad encryption:
I was born and raised in Eastern Europe. When I was a teenager it was common that police would stop me and ask me to show them contents of my backpack. Here you had two options - either (a) you'd show them the contents or (b) you would get beat up to a pulp and disclose the contents anyway.
It's at least a five-hour debate whether that's good or not, but in my mind, for 90% of cases, if you're a law-abiding citizen you can simply unlock your phone and be done with it.
Sure, there are remaining 10% of use cases where you are a whistleblower, journalist or whatever and you want to retain whatever you have on your phone. But if you put yourself in that situation you'd better have a good understanding of the tech behind your wellbeing. Namely - use something else.
There was a great blog post a few years ago that reverse engineered the on-disk data structures and demonstrated extracting the key. Of course, I can't find it now.
Microsoft themselves [1] say:
> If a device uses only local accounts, then it remains unprotected even though the data is encrypted.
There is a further condition: if you explicitly enable bitlocker then the key is no longer stored on the disk and it is secure.
When I run "manage-bde -status" on my laptop it says "Key Protectors: None found". If the TPM was being used that would be listed.
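If you want to script that check, something along these lines works from an elevated prompt (the "None found" string is what manage-bde prints in that state, as above):

    # sketch: flag a volume that is encrypted but has no key protectors,
    # i.e. the "clear key" state described above. Windows, elevated prompt.
    import subprocess

    out = subprocess.run(
        ["manage-bde", "-status", "C:"],
        capture_output=True, text=True, check=True,
    ).stdout
    print(out)
    if "None found" in out:
        print("No key protectors: the volume master key is stored in the clear.")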
Have you tried plugging the disk or ssd from your old laptop into another computer?
In the year 2026, the rule of thumb is: if you can get your work done without touching Windows, then you should.
It goes without saying you should never trust any third party let alone a big corp.
Big shocker! Gotta love the collusion between government and big tech. It never ends, and our 4th amendment will forever be infringed through these loopholes -- and everyone will carry on not caring enough about it.
What quid pro quo? Is there an allegation that the FBI gave Microsoft something in exchange?
As far as I can see this particular case is a straightforward search warrant. A court absolutely has the power to compel Microsoft to hand over the keys.
The bigger question is why Microsoft has the recovery feature at all. But honestly I believe Microsoft cares so little about privacy and security that they would do it just to end the "help customers who lose their key" support tickets, with no shady government deal required. I'd want to see something more than speculation to convince me otherwise.
So, forcing users to connect to the Internet and log in to a Microsoft account is about more than tracking you and selling ads -- Microsoft may be intentionally helping law enforcement unlock your computer -- and that's not a conspiracy theory.
> Johns Hopkins professor and cryptography expert Matthew Green raised the potential scenario where malicious hackers compromise Microsoft’s cloud infrastructure — something that has happened several times in recent years — and get access to these recovery keys.
Bitlocker isn't serious security. What is the easiest solution for non-technical users? Does FDE duplicate Bitlocker's functionality?
This is disappointing but I wonder if this is quid pro quo. Microsoft and Nadella want to appear to be cooperating with the government, so they are given more government contracts and so they don’t get regulatory problems (like on antitrust or whatever).
This isn't even about Microsoft or BitLocker. This is about the U.S.A.: anyone who trusts the rule of law in the U.S. is a fool.
Yes, the American government retrieves these keys "legally". But so what? The American courts won't protect foreigners, even if they are heads of state or dictators. The American government routinely frees criminals (the ones that donate to Republicans) and persecutes lawful citizens (the ones that cause trouble to Republicans). The "rule of law" in the U.S. is a farce.
And this is not just about the U.S. Under the "Five Eyes" agreement, the governments of Canada, the UK, Australia, and New Zealand could also grab your secrets.
Never trust the United States. We live in dangerous times. Ignore it at your own risk.
it's like Microsoft has nothing better to do than keep digging the hole that buries Windows as a mainstay operating system deeper and deeper with every new day.
Keys are stored securely in a TPM in the sense that a random program has no access to it. They are not stored safely there in the sense that they couldn’t possibly get destroyed. TPM hardware, or the motherboard that hosts it, occasionally fails. Or you might want to migrate your physical hard drive to a different PC. That’s the purpose of backing up the keys to the cloud. Alternatively, you can write down a recovery key and put it in your safe. Personally, I put it in my password vault that also happens to be backed up to the cloud (though not Microsoft’s).
There's also no security in the communication between the CPU and the TPM, so you can plug in a chip that intercepts it and copies all the keys, or plug the TPM into a chip that pretends to be the CPU and derives identical keys.
The TPM on most computers these days is a sectioned off part of the CPU that only talks through channels on the package/die (fTPM). Good luck plugging something in on that.
> Not a reason to violate privacy IMO, especially when at the time this was done these people were only suspected of fraud, not convicted.
Well you can't really wait until the conviction to collect evidence in a criminal trial.
There are several stages that law enforcement must go through to get a warrant like this. The police didn't literally phone up Microsoft and ask for the keys to someone's laptop on a hunch. They had to have already confiscated the laptop, which means they had to have collected enough early evidence to prove suspicion and get a judge to sign off and so on.
They had a warrant. That's enough. Nobody at Microsoft is going to be willing to go to jail for contempt to protect fraudsters grifting off of the public taxpayer. Would you?
Your firmware and UEFI likely accept MS keys even if you supplied your own for Secure Boot. Sometimes the keys are unable to be removed, or they'll appear "removed" but still present because losing the keys could break firmware updates/option ROMs/etc.
Similarly, your TPM is protected by keys Intel or AMD can give anyone.
If you want to extrapolate, your Yubikey was supplied by an American company with big contracts to supply government with their products. Since it's closed source and you can't verify what it runs, a similar thing could possibly happen with your smartcard/GPG/pass keys.