June 18, 2011, 2:46 p.m.
posted by vdv
File Encryption Functional Description
There's no getting around it. File encryption is a complex beast. EFS uses technologies that are only just now entering their maturity. You need to plan deployment carefully to ensure that files remain secure and available. EFS also requires a measure of insight and work from users, not always a key to success.
File Encryption Process Description
File encryption uses elements from a variety of Windows subsystems. To introduce the cast of characters, here is a summary of what happens when a file is encrypted. See Figure for a block diagram of the components:

1. The EFS driver, working with the NTFS driver in the Windows Executive, prepares to encrypt the $Data attribute of the file.
2. The Microsoft Crypto Provider generates a 128-bit random number, the File Encryption Key (FEK), which is used to encrypt the file's contents.
3. EFS encrypts a copy of the FEK with the user's public key and stores it in the file record in a Data Decryption Field (DDF).
4. EFS encrypts another copy of the FEK with the Data Recovery Agent's public key and stores it in a Data Recovery Field (DRF).
The end result is an NTFS file with gibberish in the $Data attribute that can only be understood when decrypted using the FEK. Only the user who encrypted the file or the DRA can decrypt the file because only they have FEKs in the file record. (As we'll see, Windows Server 2003 and XP have a new feature that permits giving additional users access to the file.)
File Decryption Process Description
Here's a quick rundown of what the system does to decrypt an encrypted file when a user opens it (see Figure):

1. The system retrieves the user's private key from the user's profile.
2. EFS uses the private key to decrypt the copy of the FEK stored in the file's DDF.
3. The FEK decrypts the $Data attribute as the file is read.
When a user encrypts a file, the EFS driver works in concert with the NTFS driver in the Windows Executive to encrypt the $Data attribute of the file. Figure shows a diagram of the Master File Table (MFT) record before and after encryption.
EFS converts the $Data attribute into a series of cipher blocks that it stores in the same location on the disk where the clear-text file originally resided. (If the $Data attribute is resident in the MFT record, it is made non-resident and placed out onto the disk.)
During this encryption transaction, a clear-text copy of the file is created. You can avoid this by setting an encryption flag on a folder and then saving files directly into the encrypted folder. This encrypts the data stream before it hits the disk and avoids temp files. See the following sidebar, "Wiping EFS Temp Files."
Data Encryption Algorithms
For many years, the only exportable encryption option was the Data Encryption Standard, or DES. Developed by IBM back in the 1970s, DES has just not been able to keep up with modern computing capabilities. A moderately priced machine can now break standard DES encryption handily.
To overcome this weakness, EFS uses one of two encryption mechanisms that are more secure than DES while retaining its exportability and most of its speed: DESX or 3DES (Triple-DES).
DESX is the encryption mechanism used by Windows 2000. It was formulated by Ronald Rivest at RSA Data Security in an effort to shore up DES by making it more resistant to brute-force key-search attacks. In broad terms, DESX is a block cipher that uses a three-step process to encrypt a file:

1. XOR the plaintext block with a 64-bit whitening key.
2. Encrypt the result with standard DES.
3. XOR the ciphertext with a second 64-bit whitening key.
The combination of these three steps results in an encrypted file that is much less susceptible to key-search attack than DES. Unfortunately, DESX is not much better than standard DES when it comes to resisting sophisticated cryptanalysis.
The basis of 3DES is to simply run the data through the DES engine three times, each pass using a different one-third of a larger master encryption key. The result is a cipher block that is vastly more resistant to brute force than DES alone, but performance is three times slower. DES isn't all that slow, though, and 3DES is still faster than many other alternatives.
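The DESX whitening steps and the 3DES triple pass can be sketched in a few lines of Python. This is a structural illustration only: the `toy_des` function below is an invertible keyed mixing step standing in for the real DES engine, and all function names are mine, not Windows APIs.

```python
import os

MASK = (1 << 64) - 1
C = 0x9E3779B97F4A7C15            # odd constant, so it is invertible mod 2**64
C_INV = pow(C, -1, 1 << 64)

def toy_des(block: int, key: int) -> int:
    """Stand-in for the DES engine (NOT real DES): an invertible keyed mix."""
    return ((block ^ key) * C) & MASK

def toy_des_inv(block: int, key: int) -> int:
    return ((block * C_INV) & MASK) ^ key

# DESX: whiten, encrypt, whiten.
def desx_encrypt(block: int, k1: int, des_key: int, k2: int) -> int:
    x = block ^ k1               # step 1: XOR the plaintext with whitening key K1
    x = toy_des(x, des_key)      # step 2: run the result through the DES engine
    return x ^ k2                # step 3: XOR the output with whitening key K2

def desx_decrypt(block: int, k1: int, des_key: int, k2: int) -> int:
    return toy_des_inv(block ^ k2, des_key) ^ k1   # undo the steps in reverse

# 3DES: three passes, each keyed with one-third of a 24-byte master key.
def tdes_encrypt(block: int, master_key: bytes) -> int:
    ka, kb, kc = (int.from_bytes(master_key[i:i + 8], "big") for i in (0, 8, 16))
    # encrypt-decrypt-encrypt (EDE) is the standard 3DES construction
    return toy_des(toy_des_inv(toy_des(block, ka), kb), kc)

def tdes_decrypt(block: int, master_key: bytes) -> int:
    ka, kb, kc = (int.from_bytes(master_key[i:i + 8], "big") for i in (0, 8, 16))
    return toy_des_inv(toy_des(toy_des_inv(block, kc), kb), ka)

# Round-trip demo with random keys.
k1, kd, k2 = (int.from_bytes(os.urandom(8), "big") for _ in range(3))
mk = os.urandom(24)
block = 0x0123456789ABCDEF
assert desx_decrypt(desx_encrypt(block, k1, kd, k2), k1, kd, k2) == block
assert tdes_decrypt(tdes_encrypt(block, mk), mk) == block
```

Note how cheap the DESX additions are: two XORs per block buy significant resistance to key search, which is why DESX keeps most of DES's speed.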
A body called the National Institute of Standards and Technology, or NIST, decides on procedures and practices that companies must follow to do business with the government. Because everyone wants to do business with the government, many NIST recommendations become de-facto standards for industry.
NIST standards are published in a set of Federal Information Processing Standards (FIPS) publications. The FIPS publication that describes general requirements for computer security is FIPS-140-2, Security Requirements for Cryptographic Modules. This standard is also used by the Commerce Department to determine technologies that can be freely exported and those that require special handling.
Although 3DES is strong, it is subject to certain cryptanalytic attacks. To be fully FIPS-140-2 compliant, Microsoft changed the default EFS encryption algorithm in Windows Server 2003 and XP SP1 to the Advanced Encryption Standard (AES). This can cause interoperability problems with existing Windows 2000 and XP computers that still use DESX. You cannot transport an encrypted file from Windows Server 2003 to a Windows 2000 server and do a data recovery.
For backward compatibility, you can set a group policy to force Windows Server 2003 and XP SP1 to use DESX rather than AES until you have completed your migration. The group policy that controls encryption is System cryptography: Use FIPS compliant algorithms for encryption. The policy is located under Computer Configuration | Windows Settings | Security Settings | Security Options.
File Encryption Key Protection
DESX, 3DES, and AES are symmetric algorithms, meaning that the same cipher key is used for encryption and decryption. Longer keys are better than shorter ones, and the Microsoft Crypto Provider takes advantage of the loosening of export restrictions in 2000 to use a 128-bit random number as a cipher key. This is called the File Encryption Key, or FEK.
Ordinarily, you would expect a security system to treat a cipher key like you would treat your house key. You do not leave your house key near your front door where someone could easily find it and break in. Unfortunately, the FEK must accompany a file because of portability considerations. For instance, if you do a tape restore of encrypted files to another location, you still want the capability of opening the files.
So, the FEK must accompany a file. But it must be adequately protected or else the entire encryption scheme would be compromised. What's needed is a separate encryption mechanism to protect the FEK. That's where Public Key Cryptography Services (PKCS) come into play.
PKCS is a system of public/private key pairs. Anything encrypted with one key in a public/private key pair can only be decrypted with the other key. EFS encrypts the FEK with the user's public key. Only the user's private key can decrypt the FEK, which is necessary to decrypt the file. The user's private key, then, is the sweet spot in making EFS work.
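This hybrid scheme, a fast symmetric cipher for the bulk data with the FEK wrapped by a public key, can be sketched as follows. Everything here is deliberately toy-sized: the RSA primes are tiny, the FEK is 4 bytes instead of 128 bits, and a SHA-1 keystream stands in for the DESX/3DES/AES bulk cipher. The function names are illustrative, not real Windows APIs.

```python
import hashlib
import os

# Toy RSA key pair built from small primes -- illustration only. Real EFS
# uses full-size RSA keys from the user's certificate store.
P, Q = 1000003, 1000033
N = P * Q
E = 65537
D = pow(E, -1, (P - 1) * (Q - 1))      # private exponent

def keystream(fek: bytes, n: int) -> bytes:
    """Derive n bytes from the FEK; stands in for the symmetric bulk cipher."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha1(fek + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt_file(plaintext: bytes):
    fek = os.urandom(4)                # toy-sized FEK (the real FEK is 128 bits)
    cipher = bytes(a ^ b for a, b in zip(plaintext, keystream(fek, len(plaintext))))
    # The DDF holds the FEK encrypted with the user's *public* key.
    ddf = pow(int.from_bytes(fek, "big"), E, N)
    return cipher, ddf

def decrypt_file(cipher: bytes, ddf: int) -> bytes:
    # Only the holder of the private exponent D can unwrap the FEK...
    fek = pow(ddf, D, N).to_bytes(4, "big")
    # ...which then drives the fast symmetric bulk decryption.
    return bytes(a ^ b for a, b in zip(cipher, keystream(fek, len(cipher))))

ct, ddf = encrypt_file(b"secret payroll data")
assert decrypt_file(ct, ddf) == b"secret payroll data"
```

The design point to notice: the expensive public-key operation touches only the tiny FEK, never the file contents, so the cost of PKCS protection is constant regardless of file size.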
The private key is stored in the user's profile. The file holding the key is a hidden, system file. This keeps it from being encrypted by the Encrypting File System.
EFS does not use PKCS for encrypting an entire file because of poor performance. The encryption methods used by PKCS are impressively secure but very slow and processor-intensive. The combination of a moderately secure bulk file encryption mechanism and a highly secure PKCS key protection mechanism is fairly standard in the industry. S/MIME (Secure Multipurpose Internet Mail Extensions) uses it to protect email. IPSec (IP Security) uses it to protect data communications.
The data structure used to hold the encrypted FEK is called a Data Decryption Field, or DDF. The DDF also contains a copy of the user's public key certificate for identification. The DDF is stored along with the file in a special NTFS attribute called $Logged_Utility_Stream.
EFS also makes provision for accessing a file if the user accidentally deletes the private key. When EFS encrypts a file, it includes another copy of the FEK encrypted with a File Recovery public key issued to the domain Administrator account. This account is called a Data Recovery Agent, or DRA. You can use the DRA's private key to decrypt its copy of the FEK, which then gives access to the file.
EFS stores the DRA's copy of the FEK in a data structure called a Data Recovery Field, or DRF, along with a copy of the DRA's public key certificate. The DRF is stored in its own $Logged_Utility_Stream attribute in the file record.
Encrypted File Sharing
Windows Server 2003/XP permits sharing access to encrypted files. It does this by encrypting another copy of the FEK with the public EFS key of any selected users. This gives the users the ability to decrypt the file, assuming that they have sufficient NTFS permissions to open it in the first place.
The user who first encrypts a file selects additional users to share the file via the file's Properties window. Click Advanced to see the encryption option, then click Details to see the list of users and DRAs assigned to the file. Figure shows an example.
Figure. EFS Details for a file showing the users selected to access the encrypted file and the DRAs assigned to the file.
When a name is added to the list of authorized users for an encrypted file, EFS obtains the user's public key certificate either from the local certificate store (the Registry) or from Active Directory.
Any user with access to an encrypted file can add other names to the list of authorized users. Train your users to grant shared access only when absolutely necessary and then only to trusted colleagues.
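The sharing mechanism described above, one wrapped copy of the FEK per authorized user plus a copy for the DRA, can be modeled as a simple data structure. A hedged sketch: `wrap_fek` below is a symmetric XOR stand-in for the real asymmetric RSA wrapping (so wrap and unwrap are the same call), and the record layout and function names are mine, not the actual on-disk format.

```python
import hashlib
import os

def wrap_fek(fek: bytes, key: bytes) -> bytes:
    """Stand-in for RSA-wrapping the FEK with a user's public EFS key.
    (An XOR pad here, so wrap and unwrap are the same operation; real
    EFS wrapping is asymmetric.)"""
    pad = hashlib.sha256(key).digest()[:len(fek)]
    return bytes(a ^ b for a, b in zip(fek, pad))

def new_encrypted_record(owner: str, owner_key: bytes,
                         dra: str, dra_key: bytes) -> dict:
    """Model of the key fields stored with an encrypted file."""
    fek = os.urandom(16)               # 128-bit File Encryption Key
    return {
        "DDF": [{"user": owner, "fek": wrap_fek(fek, owner_key)}],  # Data Decryption Fields
        "DRF": [{"agent": dra, "fek": wrap_fek(fek, dra_key)}],     # Data Recovery Fields
    }

def share_with(record: dict, my_name: str, my_key: bytes,
               new_user: str, new_key: bytes) -> None:
    """Any user already in the DDF list can authorize another user."""
    mine = next(d for d in record["DDF"] if d["user"] == my_name)
    fek = wrap_fek(mine["fek"], my_key)           # unwrap my copy of the FEK
    record["DDF"].append({"user": new_user, "fek": wrap_fek(fek, new_key)})
```

The sketch makes the trust model concrete: sharing never re-encrypts the file itself, it only appends another wrapped FEK, which is also why anyone who can unwrap the FEK can extend the list.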
The Resource Kit includes a command-line tool called EFSINFO that can quickly display the name of the user who encrypted a file, any users who have been added to the file, and the DRA(s), if any, assigned to the file. Here is an example EFSINFO listing for a file on a standalone server without a local DRA:
E:\>efsinfo /u /r /c test.txt

test.txt: Encrypted
    Users who can decrypt:
        CX612097-B\Admin (Admin([email protected]))
        Certificate thumbprint: 839E 9492 CF52 C280 6BB3 5EF5 1C60 B38A C464 5E8C
        Unknown (Bill([email protected]))
        Certificate thumbprint: BEB0 795E 3F50 F36B 6357 C660 1164 C707 52F8 666B
    No recovery agent is found.
The thumbprint information comes in handy if you have a user with multiple EFS certificates. You can quickly identify which certificate was used to encrypt the file. You can find out the user's current certificate using cipher /y. Here is a sample listing:
E:\>cipher /y

Your current EFS certificate thumbnail information on the PC named localbox is:
  839E 9492 CF52 C280 6BB3 5EF5 1C60 B38A C464 5E8C
Private Key Protection
Have you ever seen the old 1960s television series Get Smart? At the start of each episode, Maxwell Smart (played by Don Adams) walks down a long corridor through a series of doors that leads to a phone booth where he dials a number and drops through a trapdoor. That pretty much sums up the way a user's private key is protected, one key after another key after another. Ready to see them in action? Here goes. . . .
The private key is stored in the user's profile under \Application Data\Microsoft\Crypto\RSA\<User_SID>. To protect the private key, the Microsoft Crypto Provider uses a Session key generated by the Data Protection API, or DPAPI.
The DPAPI generates this Session Key using a secret derived from the user's password hash. Details of how this secret is generated can be obtained from www.microsoft.com/serviceproviders/whitepapers/security.asp. In essence, the user's standard logon password hash is hashed again, this time using the 160-bit SHA-1 algorithm. It is then run through 4000 iterations of a password protection algorithm, PBKDF2, described in a document titled PKCS #5, "Password-Based Cryptography Standard." (See www.rsalabs.com for more information.) This creates a pseudo-random number based on the user's password hash. DPAPI refers to this as a Master key.
The Master key is itself encrypted using a special function called an HMAC (Hash-based Message Authentication Code). An HMAC incorporates a secret key into a standard hashing function. The HMAC used to encrypt the Master key is a hash of the Master key itself with the user's password hash as the secret key. The encrypted Master key is stored in the user's profile under \Application Data\Microsoft\Protect\<User_SID>. So, as you can see, the reliability of the entire structure comes down to the security of the user's password. If I know your password, I can get access to your encrypted files. Simple as that. You can avoid this vulnerability by using smart card authentication. See Chapter 20, "Managing Remote Access and Internet Routing," for details about deploying smart cards.
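The derivation chain just described, rehash the password hash with SHA-1, run PBKDF2 for 4000 iterations to get the Master key, then HMAC the Master key with the password hash, maps directly onto Python's standard library. This is a simplified sketch: the salt handling, output lengths, and storage format of the real DPAPI are more involved, and the function names are mine.

```python
import hashlib
import hmac

ITERATIONS = 4000                      # PBKDF2 iteration count per PKCS #5

def derive_master_key(password_hash: bytes, salt: bytes) -> bytes:
    # The logon password hash is hashed again with SHA-1, then run through
    # PBKDF2 to produce a pseudo-random Master key.
    rehashed = hashlib.sha1(password_hash).digest()
    return hashlib.pbkdf2_hmac("sha1", rehashed, salt, ITERATIONS)

def protect_master_key(master_key: bytes, password_hash: bytes) -> bytes:
    # HMAC-SHA1 over the Master key, keyed with the password hash, models
    # the protection applied before the key is stored in the profile.
    return hmac.new(password_hash, master_key, hashlib.sha1).digest()
```

Because every link in the chain is keyed by material derived from the password hash, the sketch also makes the stated weakness obvious: anyone holding the password hash can replay the whole derivation.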
When a user changes her password, DPAPI builds a new Master key and then uses it to generate a new Session key to re-encrypt the private keys when the user next logs on. If you as an administrator reset a user's password in Active Directory, DPAPI will build a new Master key the first time the user logs on with the new password. DPAPI then generates a new Session key derived from the new Master key and uses it to re-encrypt the private keys.
For machines that are members of a Windows Server 2003 or Windows 2000 domain, a backup copy of the Master key is stored at the local machine. This backup copy uses a public key stored in Active Directory as the secret key for the HMAC used to encrypt the Master key. This permits Master key recovery should the local copy get corrupted.
Private Key Protection on Standalone Machines
In a domain environment, the user password hash is stored in Active Directory. A copy of the hash is cached locally in the Registry under the Secrets key in the Security hive (often called LSA Secrets) but this copy cannot be changed as long as the user is off the network.
The same is not true for standalone machines, where the user password hash is stored in the local SAM. If a bad guy can get physical access to a standalone machine, he can use one of several utilities to change the user passwords stored in the SAM, including the Administrator password. Because the local Administrator account is the DRA for file encryption on a standalone machine, if you can log on as the Administrator, you stand a chance of compromising the user's encrypted files.
At least this was the situation in Windows 2000. The operation of the DPAPI has been changed in Windows Server 2003 and XP to eliminate this security flaw. In Windows Server 2003 and XP, if a user's password on a standalone machine is changed by any means other than one that includes DPAPI function calls, the Master key is not re-encrypted. Without access to the Master key, EFS cannot access encrypted files on behalf of the user. This keeps encrypted files safe from interlopers.
Windows Server 2003 and XP also have a new feature called a Password Reset Disk, or PRD. The PRD permits a user who has forgotten his password to change it at the logon window. DPAPI does not update the Master key if the user changes his password using the PRD. This means that the user loses access to encrypted files, which is especially nasty on XP because there is no requirement to have a DRA.
There is a workaround, however. The user can log on with the new password and then change the password back to what it was, if he can remember it. This will cause DPAPI to create a new Master key with the new user credentials. The Session key derived from this Master key will decrypt the user's private EFS key, giving the user access to encrypted files once again.
This change to the DPAPI does not prevent a bad guy from changing the Administrator account password then logging on and poking around trying to find clues for the original user passwords. For example, the bad guy might use a password cracker like L0phtcrack to derive the user passwords from the hashes stored in the SAM. It's possible that the user of the standalone laptop who was so concerned about protecting files with encryption was not as concerned about access security and used a simple password, or maybe even a blank password, for logon.
For this reason, users on standalone machines who want to encrypt files should always use long, complex passwords and should never save password hints, dial-up passwords, or any application password that matches their logon password. This potential vulnerability is only present on standalone machines.
EFS Key Storage
When a user first encrypts a file, the system obtains a set of EFS keys issued in the form of certificates. PKCS certificates are normally issued for specific reasons. In this instance, the certificate is issued for the express purpose of supporting the Encrypting File System. An EFS certificate cannot be used to digitally sign an email message, for instance.
If there is no Certification Authority (CA) available, the Microsoft Crypto Provider on the local machine issues the EFS certificate. This is called a self-signed certificate, meaning that it has no chain of authority leading to a CA. Self-signed certificates have security significance only on the local machine because other machines cannot check validity of the issuer.
If you have a Microsoft CA, a group policy informs the local clients of the CA's existence by distributing the CA's public key certificate. Clients send certificate requests to the CA. When the CA gets a certificate request, it verifies the requestor's identity, then issues the certificate. It digitally signs the certificate with its own private key so the client can validate the certificate with the CA's public key.
After they are obtained, the EFS keys are stored in the user's local profile under a hidden folder called Application Data. Figure shows the key locations.
It is very important to train desktop technicians to treat profiles carefully after you deploy file encryption. If you delete a user's profile, the user loses access to all encrypted files on that machine.
As we'll see a bit later when we look at data recovery, the domain Administrator account is the default Data Recovery Agent for all member servers and desktops in a domain. A profile is created for this account on the first domain controller in the domain. The profile contains the File Recovery (FR) public and private keys used by the DRA. If you delete the Administrator profile on the first domain controller in the domain, you cannot recover any encrypted files in the entire domain.
You can and should export the Administrator's FR private key to a certificate so it can be safely stored away and used only when necessary. You can also designate other DRAs to use in addition to the domain Administrator.
EFS Keys and Roaming Profiles
The Application Data folder travels with a user who has a roaming profile. This is good news, because it means the same public key is used to encrypt files on different machines. This simplifies aggregating the files for the user, should that become necessary.
The bad news is that you cannot encrypt files within a roaming profile because of a Catch 22 in file handling. Let's say that you were able to encrypt a file in the My Documents folder of a roaming profile. When you log off, that encrypted file would be copied to a server.
When you log on the next time, the system downloads the contents of your roaming profile then uses it to initialize your local operating environment. But wait, there's an encrypted file in that roaming profile, and you can't open that encrypted file without your private key; but your private key is in your profile, which the system won't initialize until it has copied all the contents.
To avoid this situation, Microsoft prevents users from encrypting files in roaming profiles. If a user does encrypt a file inside a profile, the system gives an error at logoff stating that the file will not be copied to the server.
Users with roaming profiles who want to encrypt files in My Documents can do so this way: use folder redirection to place the My Documents folder on a server, enable offline folders, then encrypt the contents of the offline folder cache.