Saturday, November 29, 2014

Build a low budget computer meant for 24/7 operation.



Sometimes you need a machine you can trust to stay on for days at a time, but you're not looking to invest in a server-grade build, nor do you need exceptional performance. Let's say it just has to run a script or a process constantly - for example, a DVR server.
In this case you need to optimize your configuration for durability. To start, we need a well-ventilated chassis. If the budget is limited, a basic well-ventilated chassis will do.

Motherboard - we want the motherboard to be as cheap and as durable for the price as possible. A good approach is to go through a vendor's lineup: start at the cheapest model and raise the bar until you reach one with all-solid capacitors. For example, currently (end of 2014) Asus has the H81 series motherboards, and one of the cheapest is the H81M-K. It does not have all-solid capacitors, however, so if we take the model right above it - the H81M-A - we get an all-solid-capacitor configuration for only slightly more money.

CPU - if you're not looking for high performance, I would suggest a simple Intel CPU, such as the current Haswell-based Pentium dual-core series. They are cheap and carry Intel's quality standard.
You can also go with a T-series Intel CPU, which consumes the least power.

RAM - again, something of acceptable quality but not gamer grade. A built-in heat sink is always a plus. An example: Corsair CMV4GX3M1A1600C11.

Hard Drive - this is where you need to invest a bit more, in a drive made for 24/7 operation, such as the Western Digital Caviar Red, designed for RAID, NAS and 24-hour use. Similar models from other manufacturers include the Seagate Surveillance HDD and Toshiba's high-durability specialty drives.

PSU - a great way to provide stable power to a long-running machine is a good power supply unit you can trust; some even come with a built-in surge protector. Also make sure you get a PSU with an 80 Plus rating, which means it wastes less power. A good example of an affordable 80 Plus PSU is the Corsair VS350.


Tuesday, September 23, 2014

Asus H81M-K network disconnections - solved.

If you are experiencing random Ethernet problems on a machine based on this motherboard - disconnections ("cable unplugged" even though it's plugged in, or the cable not detected at all), limited connectivity, or internet/LAN issues even when the connection seems stable (network cable recognized, IP address correct) - the cause may be the faulty NIC driver that ships with this motherboard.

The best way to stabilize your onboard NIC (Realtek PCIe GBE Family Controller) is to download the latest driver from Asus, and once it's installed, do the following:


  • Go into Device Manager and double-click your NIC in the list of devices.
  • In the Power Management tab, untick the "Allow the computer to turn off this device to save power" option.
  • In the Advanced tab, disable "Energy-Efficient Ethernet" and "Green Ethernet".
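If you prefer to script this instead of clicking through the GUI, the same tweaks can be sketched with the NetAdapter PowerShell cmdlets (Windows 8 / Server 2012 and later). The adapter name "Ethernet" and the exact property display names are assumptions - they vary by driver, so check Get-NetAdapter and Get-NetAdapterAdvancedProperty first:

# Turn off all power management on the adapter
# (broader than the single GUI checkbox, but in the same spirit)
Disable-NetAdapterPowerManagement -Name "Ethernet"

# Disable the power-saving Ethernet features (display names vary by driver)
Set-NetAdapterAdvancedProperty -Name "Ethernet" -DisplayName "Energy-Efficient Ethernet" -DisplayValue "Disabled"
Set-NetAdapterAdvancedProperty -Name "Ethernet" -DisplayName "Green Ethernet" -DisplayValue "Disabled"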


This also works for other motherboards that use the Realtek PCIe GBE Family Controller.

Saturday, September 20, 2014

Optimize QNAP NAS security

In this post I will talk about tweaking your QNAP NAS device to be more secure than its default configuration.
I'm going to assume that the main user and folder structure has already been configured at this point and you're good to go, but you'd like to maximize security for the NAS using the built-in features.

As a side note, I am using QTS firmware version 4.1.0 to demonstrate the features shown in this post.

Let's start by enabling SSL for the web management console of the NAS.


You can force a secure connection so that the only way to reach the web console is over SSL, but if something goes wrong (with the SSL certificate) you may not be able to log onto the web console at all - so pick the option that suits you.
By default the NAS uses a self-signed SSL certificate; it is also possible to upload and use a third-party certificate from a verified provider.
Both the certificate and the private key can be uploaded in the Security > Certificate & Private Key tab.

Now we can set up email alerting. I believe email alerts are a very practical way of staying up to date on the server's health and security issues. This is done under the Notification section. First set up the SMTP server; I suggest Gmail, as its configuration is already built into the console and it provides a very stable, free email service - if you don't already use Gmail, you may want to create an account specifically for this server alerting system. Put in your Gmail address and password, and hit the "Send a Test Email" button to see whether you can receive email alerts from the server. If you can't, your network hardware may be blocking it, so it's something you should look into - open the outbound SMTP SSL ports in your router/gateway, for example.
Once the test email arrives, set the server to send alerts to your preferred email address; this is done in the Alert Notification tab.
Check "Send system error alert by: Email" and "Send system warning alert by: Email", and make sure to enter your target email address(es) under "Email Notification Settings" - if you don't set this up, you will not receive any alerts.
Once done, hit the Apply All button on the bottom of the page.

Now, I should mention that if your NAS is meant to be accessible remotely - be it via VPN or FTP, or because you'd simply like to administer it remotely through the web console - there will be constant hack attempts coming from the Internet. Most of the time these are not targeted at your server specifically; they are automated probes trying to guess the username and password of any NAS or other protected machine that is reachable remotely. To remove that threat to the data and stability of your NAS, we set the server to automatically ban or block IPs that try to break into the device. The feature that controls this is located under Network Access Protection inside the Security section of the web console.


First tick the "Enable Network Access Protection" box, then configure the protocols for which the server will monitor access attempts and react accordingly.
I usually set the connection methods shown in the screen grab. Note that SAMBA and AFP are not monitored, as they are local connection types and monitoring them may interfere with your users' access. You don't have to block offending IPs forever - you can ban them temporarily - but I see no reason to, since they will resume trying to penetrate your network as soon as they are unblocked.
Once done, make sure to hit Apply or Apply All.
This feature was introduced after version 3.8.0, so if your NAS is running an older firmware version, maybe it's time to update.

Another important step is reviewing which connection methods are open on your NAS.
If you don't plan to administer the NAS via SSH, turn SSH off - a lot of hack attempts come in over SSH, and it is enabled by default, so disabling it raises the level of security.
To disable SSH and/or FTP, go into the Network Services section and disable the unnecessary connection methods.

Finally, you may want to enable logging of file usage on your NAS. This records internal users' actions in detail, as well as incoming hack attempts. It is done in System Connection Logs under System Logs: click the Options button and check all of the connection methods relevant to your situation. Make sure to check SAMBA to monitor local users' connections. Once the log fills up to 10,000 events, it can automatically be dumped into a CSV file in one of the shared folders. You can create a protected log folder that only you, as the admin, have access to, and point the CSV file creation there.

It is worth mentioning that a QNAP NAS comes with an internal antivirus feature that is disabled by default. If you want the NAS to scan the files it hosts, enable the Antivirus application (located under the Applications section at the bottom of the console); you can schedule scan jobs and automatic definition updates there as well.

Thursday, September 4, 2014

New USB devices or existing devices stop working in Windows 7/8

Recently I started noticing a widespread problem with certain machines being unable to accept new USB devices, or having existing devices stop working (driver-wise). This may be related to a PUP (potentially unwanted program) called SafetyNut, a sub-application that comes with the Ask.fm toolbar.

If you are unable to use any new USB devices that your computer hasn't recognized before, or things like your LAN or WLAN card suddenly stop working and show a driver problem in Device Manager, it may be due to the aforementioned PUP.

The simple way to resolve this is to uninstall the Ask.fm toolbar.

A more thorough solution is to run a malware cleaner, disable the SafetyNut service (yes, it actually runs as a service, and one that is recognized as malicious), clean the discovered PUPs, and uninstall the Ask.fm toolbar.
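As a sketch, the service can also be stopped and disabled from an elevated command prompt. The service name "SafetyNut" here is an assumption - verify the exact name in services.msc or with sc query first:

rem Check whether the service exists and is running
sc query SafetyNut

rem Stop it and prevent it from starting again
sc stop SafetyNut
sc config SafetyNut start= disabled

Note the mandatory space after "start=" - that quirk of sc's syntax trips many people up.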

Friday, August 29, 2014

Windows Update error 80246002



This error started appearing in Windows 7 a few days ago; some say it is due to the update KB2982791 (as yet unconfirmed).

If you were affected by this error - here's how to fix it (so far).

First of all - what doesn't work:

  • Uninstalling the KB2982791 update.
  • Deleting the SoftwareDistribution folder contents inside the Windows folder (this used to work for me in previous Windows Update errors).
  • Renaming the Download subfolder of the SoftwareDistribution folder.
  • Restoring the system to an earlier state may not fix this problem.
  • Running the Windows Update Troubleshooter may not fix this problem.
  • Applying the KB947821 patch may not fix this problem.

What does work (confirmed by many cases, including my own experience with this error):

You need to manually set the DNS addresses for your network connection to those of a well-known public DNS server (such as Google DNS) to successfully get rid of the update error.

1. Go to Network and Sharing center inside Control Panel.
2. Click "Change Adapter Settings" on the left side pane. 
3. Right click the network connection you're currently using to be connected to the internet.
4. Choose Properties.
5. Double-click "Internet Protocol Version 4 (TCP/IPv4)".
6. Change "Obtain DNS server address automatically" to "Use the following DNS server addresses" - enter 8.8.8.8 as the preferred DNS server and 8.8.4.4 as the alternate.
7. OK all of the windows.
8. Now either restart the machine and check your updates again, or run ipconfig /flushdns in an elevated command prompt.
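The same DNS change can be sketched from an elevated command prompt; the connection name "Local Area Connection" is an assumption - substitute the name of your own adapter as shown in Network Connections:

netsh interface ip set dns name="Local Area Connection" static 8.8.8.8
netsh interface ip add dns name="Local Area Connection" 8.8.4.4 index=2
ipconfig /flushdns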


After this procedure, Windows Update should start searching for updates. It will take a while, so be patient. Eventually you will be presented either with more updates or with a statement that there are no newer updates.



Friday, August 1, 2014

OneDrive for Business doesn't sync - solved.



Is your OneDrive folder not syncing anymore? Do you get sync errors that won't go away? Do all of the files in the sync folder have a red X on them? If so - here are the answers.

Sometimes you copy a large amount of data into your OneDrive sync folder, or you make a drastic change to the folder structure (taking all of the main folders and putting them into one subfolder, for example) - this can lead to major sync problems.

What doesn't work:

Stopping and restarting sync, syncing a new folder, deleting everything in your sync folder and hoping it will resync, deleting everything on the server and hoping it will resync from your PC folder, running the "repair" function of the OneDrive desktop software, uninstalling and reinstalling OneDrive.

If you've done all of the above and still no luck, read on.
The only solution I've found so far is the procedure below (recommended by the MS support team), and it actually works.

Important! Make sure the server-side copy of your files is up to date, because you will lose all of the local data in your computer's sync folder, and your computer will then resync from the server.
If it's not up to date, back up the data in your sync folder, just in case.

1. Right-click the Windows taskbar, select Start Task Manager, and get a list of running processes as follows:
If you're running Windows 7: select the Processes tab. If you're running Windows 8: select More Details in the bottom left, then select the Details tab.

2. Verify that none of the following processes are running:
groove.exe
msosync.exe
msouc.exe
winword.exe
excel.exe
powerpnt.exe

3. If any of the above processes are running, stop the process by right-clicking it and selecting End Process.

4. Give yourself an administrator role as follows:
If you're running Windows 7: click the Start button and type cmd in the search bar. When cmd.exe appears in the results, right-click it and select Run as Administrator.
If you're running Windows 8: move your mouse to the bottom left corner of the screen, right-click the Start icon that appears, and then select Command Prompt (Admin).

5. At the command prompt, delete the Office file cache and Spw folders by issuing four commands as follows:
a. Type this and then press the Enter key:
cd %USERPROFILE%\AppData\Local\Microsoft\Office\15.0\ 
b. Type this and then press the Enter key:
rmdir OfficeFileCache /s 
c. Type this and then press the Enter key:
cd %USERPROFILE%\AppData\Local\Microsoft\Office\  
d. Type this and then press the Enter key:
rmdir Spw /s

6. If you get an error when executing either rmdir command (steps b or d), one of the .exe processes is probably still running. Correct the problem by returning to the Task Manager (step 2), stopping the processes, and then removing the directories as described previously.

7. Start the OneDrive for Business client and re-sync the library.
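Steps 2-5 above can be condensed into a small batch sketch, run from an elevated command prompt. It assumes Office 2013 (hence the 15.0 folder) and deletes the cache folders without confirmation - make sure your server-side copy is current first:

@echo off
rem Kill the Office/OneDrive sync processes if they are running
for %%p in (groove.exe msosync.exe msouc.exe winword.exe excel.exe powerpnt.exe) do taskkill /f /im %%p >nul 2>&1

rem Delete the Office file cache and Spw folders (no prompts)
rmdir /s /q "%USERPROFILE%\AppData\Local\Microsoft\Office\15.0\OfficeFileCache"
rmdir /s /q "%USERPROFILE%\AppData\Local\Microsoft\Office\Spw"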


Thursday, July 24, 2014

Sync files remotely to a QNAP NAS


Sync your files remotely to a QNAP drive, wherever you are.

I work a lot with QNAP NAS drives and I enjoy the stable remote access they provide, be it remote administration and management or data access.
Here I will be talking about setting up your very own cloud service on your QNAP NAS drive, one that will function much like Google Drive or MS OneDrive. It does take some major setting up, but after that things should run smoothly.

In this scenario, user data has to be available at all times and synced whenever an internet connection is available, with all data going to the QNAP drive remotely (which provides backup and lets others administer the data).
We need the remote folders to support Offline Files, so we need to set up a VPN service on the QNAP. If it weren't for Offline Files (and the whole point of syncing files so they're always available), it would be easier to simply run an FTP service from the QNAP drive, which requires less effort. But the Offline Files feature demands that the QNAP folders be available as if we were accessing them over a LAN - hence the VPN service.

Official QNAP VPN setup instructions are found HERE.

QNAP server-side settings:

  • Inside the QNAP web console, head over to Applications > VPN Service.
  • You should first forward the necessary VPN ports using the Auto Router Configuration.
  • QNAP provides a DDNS service called myQNAPcloud, which gives you a free dynamic DNS name if your server's public IP isn't static. You can also configure the VPN the easy way using the myQNAPcloud feature inside Qsync, under Network Services.
  • Inside Applications>VPN Service> VPN Server Settings - Check the "Enable PPTP VPN server"
  • Important! Under "VPN Client IP Pool", set the IP range. Make sure you use a unique range that is not commonly used, because you wouldn't want the VPN IP to clash with the client computer's local network IP.

Client-side (Windows 7/Windows 8/Windows 8.1) settings:
  • You can install the myQNAPcloud Connect utility, which will build the VPN dial-up connection for you, or you can set it up manually yourself. I simply installed the utility to let it set up the VPN connection and then removed the utility from startup. The utility is simple; however, it will not automatically connect your VPN if you're on a wifi network, and it will not automatically reconnect if the internet connection is disrupted. We are going to fix this flaw with some manual settings and PowerShell scripting.
  • In case you decide to build the VPN dialup connection yourself - here are the settings:
1. Inside Network and Sharing Center click "Set up a new connection or Network",
2. Choose "Connect to a workplace"
3. Choose "Use my Internet connection (VPN)"
4. Inside Internet address - insert the public IP of the network on which your NAS is located. Or if you have a DDNS by QNAP or third party - put it here. 
5. Check "Don't connect now..."
6. Insert your VPN username and password (usually it's the main NAS admin by default, but you can add VPN users inside the QNAP VPN Service page inside the web console)
7. Click Create but don't connect yet. Click Close.
8. Inside Network and Sharing center click on "Change Adapter settings".
9. Right click your newly created VPN connection icon and choose Properties.
10. Under the Security tab, make sure Type of VPN is PPTP, Data Encryption is set to Optional, and the PAP, CHAP and MS-CHAP v2 protocols are enabled.
11. Under the Networking tab, make sure that only IPv4 is checked; double-click it, choose Advanced, and uncheck "Use default gateway on remote network" so your internet traffic does not go through the VPN's gateway.
12. OK all of the windows of the connection properties. 

Now we will write a PowerShell script that will run in the background, check for connection drops, and reconnect if necessary.
  • First let's create a folder on the C: drive and call it Script.
  • Let's create a file in that folder and call it vpn.ps1
  • Inside the file let's put the following code (using Notepad or another text editor):
# IP of the NAS as seen inside the VPN - adjust to your setup
$ip = "10.0.20.10"
$result = Get-WmiObject -Query "SELECT * FROM Win32_PingStatus WHERE Address = '$ip'"
if ($result.StatusCode -eq 0) {
 Write-Host "$ip is up."
}
else {
 Write-Host "$ip is down."
 Write-Host "Disconnecting..."
 rasdial.exe YourVPN /DISCONNECT
 Write-Host "Connecting..."
 rasdial.exe YourVPN vpnUsername vpnPassword12345
}

Make sure you put the correct VPN-side IP of the NAS in the first line of the script. If your VPN IP pool starts at, say, 10.0.10.2, that means the server address is 10.0.10.1.
Also make sure that after rasdial.exe you put the name of your VPN dial-up connection and its username and password (where stated).

Now we fire up PowerShell (type powershell in the Start menu search and run it in elevated mode) and set it to allow running scripts. First, let's run this command:

get-executionpolicy

If the answer is "Restricted", run this command:

set-executionpolicy unrestricted

This will ask you to confirm, press Y.

After this is done, all that's left is setting up the task scheduler to run this script.

Inside Task Scheduler create a new task; make sure it runs with highest privileges, whether the user is logged on or not. The trigger should be at startup, repeating the task every 5 minutes. Under Actions leave it as Start a program, point the path to C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
and add an argument to run the script:
-File C:\Script\vpn.ps1
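If you prefer the command line, roughly the same task can be sketched with schtasks from an elevated prompt (the task name here is made up, and the "run whether the user is logged on or not" option still has to be set in the GUI):

schtasks /create /tn "VPN Keepalive" /sc minute /mo 5 /rl highest /tr "C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -File C:\Script\vpn.ps1"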


For the method of a stable VPN connection I used the tutorial posted here.

Now that we're done setting up a stable VPN connection, it's time to enable Offline Files. (Offline Files is available only in the Professional, Ultimate and Enterprise editions of Windows.)
Browse to your server using its VPN IP, find the folder you'd like to sync, right-click it and choose "Always available offline". It is best to map this folder as a network drive so it's easily accessible when offline. Also make sure no one else is using the shared folder, because sync conflicts may happen.
In Windows 8 Pro you might first have to enable Offline Files by going to Control Panel > Sync Center, clicking Manage Offline Files, and then Enable Offline Files.

Tuesday, May 20, 2014

Sync network folders to OneDrive automatically

A few weeks ago I wrote a post about synchronizing your OneDrive local folder to a network folder; in this post, however, we will talk about synchronizing server folders to the OneDrive folder automatically using a Robocopy script.

First off, let's create subfolders inside your OneDrive sync folder that represent your network shares. For example, if we have a server named Server with shared folders named Folder1 and Folder2, we create them as subfolders inside our OneDrive sync folder, as:
(assuming the system partition is C:)

C:\Users\username\OneDrive\Folder1 (and other shared folders)
or
C:\Users\username\OneDrive for Business\Folder1 (and other shared folders)

Then, after all of our subfolders are created, we create a batch script containing:

@echo off

 robocopy "Source network folder" "Destination OneDrive Sub-folder" /MIR /COPY:DT /MON:1

We will have to create a Robocopy command like this for each one of the network folders that needs to be synced with OneDrive.

If the server requires credentials, you will have to add a line before the Robocopy command:
net use \\servername /USER:username password

After the Robocopy command you might want the script to log out, using the following command:
net use \\servername /d

This is a syncing script - that's why we're using the /MIR switch - which means that if you delete a file in the source folder, it will soon be deleted in the destination folder too. If you wish to copy files rather than sync them, use /E (optionally with /COPY:DT) instead of /MIR.

I should also mention that the /COPY:DT switch in the script above is optional; it instructs Robocopy to preserve the files' date and time stamps but not other attributes. I just prefer using it in this situation for better stability.
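One caveat when syncing several shares: with /MON:1 each Robocopy instance runs forever, so a second command on the next line of the script would never be reached. A hypothetical script for two folders (the server, user and folder names are made up) can launch each instance in its own process with start:

@echo off
net use \\Server /USER:syncuser password

rem /MON:1 never exits, so give each folder its own robocopy process
start "" robocopy "\\Server\Folder1" "C:\Users\username\OneDrive\Folder1" /MIR /COPY:DT /MON:1
start "" robocopy "\\Server\Folder2" "C:\Users\username\OneDrive\Folder2" /MIR /COPY:DT /MON:1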

Now we can save the script as a BAT or a CMD file and run it (preferably elevated) to see how it works. The script never exits unless the command prompt window is closed manually, which may annoy the user - so it's a good idea to automate its execution and keep it hidden.

To do that, go into Task Scheduler and create a task to run this script. I recommend the user's logon as the trigger: if the trigger is a certain time of day, there may be duplicates of the same process, because the process never stops on its own.

Make sure you set the SYSTEM account as the account running the task, and mark it Hidden. This will allow the script to run invisibly in the background. This starts the cmd.exe and robocopy.exe processes and they only take a few hundred kilobytes in memory.





Saturday, May 10, 2014

Windows 8 Single Language image for download



Here you can find a Windows 8 Single Language installation image. This image is in Portuguese (Português) and Spanish (Español).

DOWNLOAD Win8_EM_x64.iso

It is possible to change the language later using my post about changing the language of Windows 8 Single Language.

You can use the product key of another version of Windows 8 Single Language to activate this image (for example if you'd like to re-install Windows on your laptop).

Wednesday, April 30, 2014

Build an energy efficient computer



Here I will show a sample configuration of an energy efficient workstation.

This build, in my view, will provide power efficiency and stability as well as good performance.

PSU - any unit rated 80 Plus or higher (Bronze, Silver, Gold, Platinum or Titanium).
Top brands include Corsair, Seasonic, Cooler Master, Thermaltake, Antec and others.
I would recommend a Corsair AX series power supply (rated 80 Plus Platinum), as it wastes the least power.

CPU - any Intel 4th-generation CPU whose model number ends in T.
I recommend the i7-4765T, as its TDP is 35W - even at full load it won't go beyond 35 watts.

Motherboard - leading low-power brands include Asus, EVGA, Gigabyte and ASRock. A gaming motherboard will typically consume much more power than a mini-ITX motherboard.
I recommend the Asus Z87I-PRO - good performance and low power.

RAM - there is a low-voltage type of DDR3 called DDR3L, which uses less power and emits less heat. Options include Kingston HyperX LoVo and Genesis, and ADATA XPG, among others.
I recommend Kingston HyperX LoVo - Kingston's special line of low-voltage RAM.

Hard Drive - Western Digital Caviar Green or Seagate Barracuda LP for HDD, Samsung 840 Series for SSD.
I would recommend a Samsung 840 EVO or PRO series drive.

Optical Drive - most optical disc drives consume about the same power, but a slim drive will consume less than a standard one.

Monitor - any LED-backlit LCD monitor, preferably with high Energy Star compliance. The smaller the screen, the lower the power consumption. Here's a list of 18-22 inch monitors and their wattage.



Tuesday, April 29, 2014

Turn an IBM X3100 M4 Server into a workstation.



This is a neat little server from IBM that officially doesn't support Windows 7. In this post I will guide you through making it run Windows 7 using the built-in LSI RAID controller in RAID mode.
If you decide to use AHCI mode instead, things get easier (AHCI is supported by Windows 7), but you will miss the performance (or stability) increase you get with RAID.

The LSI RAID controller on this server has no drivers that support Windows 7 - only Windows Server 2008 (R2). There is also a great number of driver packages available from IBM for this controller, so choosing the right one is quite time-consuming. After four failed driver packages, I finally found the right one, which enables the RAID controller inside the Windows 7 installation PE. I have uploaded it HERE.
Just unrar it, copy it onto a flash drive, and browse to it during the installation (when it fails to find any hard disks).

Once the system finishes installing, most of the drivers will be in place; the only ones missing are the chipset and the onboard video (Matrox G200eR2). Download the video driver here.
Chipset drivers are available here.

Monday, March 10, 2014

Sync files to OneDrive and a local server together automatically.

I have always thought that keeping valuable data in more than one place is essential to productivity. So it's always best practice to sync your files to an external location, especially in the work environment. A good stable sync is what we will talk about in this post.

Recently, I migrated one of the companies I work for to Office 365. As some of you may know, Office 365 packages come with the OneDrive service, which provides 25GB of online storage.
Up until the migration, the users' data was automatically synchronized to the in-house file server (using Offline Files). There is a great number of ways to synchronize data from a user's machine to a network location - from Offline Files, to scripting, to third-party software. In this scenario I will talk about synchronizing to both a local network location and a cloud service (OneDrive for Business, in this case) without using any third-party software.

First, let me point out that OneDrive currently provides one sync folder on the computer, where everything you want synced has to go. You cannot designate other folders to be synchronized to OneDrive unless you copy them into the sync folder and continue working from there. So let's say you do that, but you'd still want the same data synchronized to your local file server - and you don't want to use third-party software that may create system instability or load the system up.
We're going to accomplish this task with a Robocopy monitoring script that will be running invisibly in the background. All the time.

So we create a batch script that will contain the following:

@echo off
robocopy "Source OneDrive Folder" "Destination network folder" /MIR /COPY:DT /MON:1

Now (assuming your system partition is C:), your source OneDrive folder is usually located at

C:\Users\username\OneDrive
or
C:\Users\username\OneDrive for Business

Your destination should be a local folder or a network share. Don't forget the quotation marks.

You might also need credentials if the share is password-protected; to log into the share correctly, the script has to present them. To do that, you will have to add a line that logs into the share (before the Robocopy command):

net use \\servername /USER:username password

After the Robocopy command you might want the script to log out, using the following command:

net use \\servername /d

This is a syncing script - that's why we're using the /MIR switch - which means that if you delete a file in the source folder, it will soon be deleted in the destination folder too. If you wish to copy files rather than sync them, use /E (optionally with /COPY:DT) instead of /MIR.

I should also mention that the /COPY:DT switch in the script above is optional; it instructs Robocopy to preserve the files' date and time stamps but not other attributes. I just prefer using it in this situation for better stability.

Now we can save the script as a BAT or a CMD file and run it (preferably elevated) to see how it works. The script never exits unless the command prompt window is closed manually, which may annoy the user - so it's a good idea to automate its execution and keep it hidden.

To do that, go into Task Scheduler and create a task to run this script. I recommend the user's logon as the trigger: if the trigger is a certain time of day, there may be duplicates of the same process, because the process never stops on its own.

Make sure you set the SYSTEM account as the account running the task, and mark it Hidden. This will allow the script to run invisibly in the background. This starts the cmd.exe and robocopy.exe processes and they only take a few hundred kilobytes in memory.
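As a sketch, the same task can also be created from an elevated command prompt (the script path C:\Script\sync.cmd is made up - use wherever you saved your batch file; marking the task Hidden still has to be done in the GUI):

schtasks /create /tn "OneDrive Sync" /sc onlogon /ru SYSTEM /tr "C:\Script\sync.cmd"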


You can also check out my post on how to sync network shares to OneDrive automatically.

Friday, January 10, 2014

How to optimize Windows 7 for an SSD drive




As some of you may know, an SSD drive has a limited lifespan of around 100-1000 written terabytes (here's an interesting link with statistics on SSD lifespans). Therefore it is best practice to minimize the amount of data written to it daily, especially if it's a system disk.

Here I will show how to move the most data intensive system folders to another drive, and to tweak the system to write less onto your SSD operating system drive, under Windows 7.

It is important to note that if you care about the lifespan of your SSD drive, it is always a good idea to have an additional drive in your configuration (another (less valuable) SSD or another HDD drive) to store your user files and to minimize the load on the main SSD drive.

First we move the default pagefile to the other drive's partition: go to Start > right-click Computer > Properties > Advanced System Settings > Advanced tab > under "Performance" click Settings > Advanced tab > click "Change" under Virtual Memory. This opens a "Virtual Memory" window where "Automatically manage paging file size for all drives" is checked. Uncheck it and click your system partition in the list of partitions. Select the "No paging file" radio button and click "Set". This removes the pagefile from your system partition. (Some people with 16GB of RAM or more run Windows without any pagefile at all, but I do not recommend it - problems can occur, such as explorer.exe suddenly bloating to over 15GB of RAM.)
Now click a different partition (on your other drive), select the "System managed size" radio button, and click "Set" to enable it. Click OK and restart. Once you boot back up, your system partition will have more free space than before, on account of the pagefile having moved to another partition.

Next we disable Hibernation, which also creates a file on your system partition (one that CANNOT be moved) called hiberfil.sys. We do this by running, in an elevated command prompt:

powercfg.exe /hibernate off

This immediately disables hibernation and erases the hiberfil.sys file. If you must put the computer into a low-power state, use sleep mode (if your motherboard supports it correctly) or simply turn the machine off - your SSD is fast enough to boot quickly without sleep mode.

Next - we move on to the system's temp folder, user's temp folder and temporary internet files.

First off, create a main temp folder on the secondary hard drive; let's call it MainTemp. Inside it, create a folder for the system's temp files (SysTemp) and one for the user's temp files (UserTemp).

To change the system's and user's default temp folder locations, go to Start > right-click Computer > Properties > Advanced System Settings > Advanced tab > Environment Variables...
Here you will see two lists: user variables and system variables. The user variables will usually only include the TMP and TEMP entries; change both to the path of the UserTemp folder you created on the other drive.
In the system variables, scroll down to the same TEMP and TMP entries and change both of their paths to the new SysTemp folder.
So if the other drive's partition is D:, the user's TMP and TEMP should point to D:\MainTemp\UserTemp,
and the system's TMP and TEMP should point to D:\MainTemp\SysTemp.
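The same variables can also be set from the command line with setx - a sketch assuming the folders above already exist on D: (the /M lines set the system-wide variables and require an elevated prompt; the new values apply only to processes started after the change):

rem Per-user temp folders
setx TEMP "D:\MainTemp\UserTemp"
setx TMP "D:\MainTemp\UserTemp"

rem System-wide temp folders (elevated prompt required)
setx /M TEMP "D:\MainTemp\SysTemp"
setx /M TMP "D:\MainTemp\SysTemp"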

After restarting, these changes should be enacted.

Now, we move on to the Temporary Internet Files.

With Internet Explorer it's quite easy to change the temp folder location.
First we should add another folder under our MainTemp folder on the secondary drive, let's call it IETemp.

Then go into IE and press the Alt key on the keyboard to bring up the menus. Click Tools > Internet Options; under Browsing History click Settings, then click Move Folder and navigate to the newly created temp folder on the other drive. Click OK; it will ask you to save the new location and reboot - click Yes, and the computer will reboot.

Google Chrome, however, does not let the user easily change the location of its temp folder.
Let's again create a new temp folder for Chrome on our secondary drive: ChromeTemp, under MainTemp.

What we have to do is create a shortcut to Chrome on the desktop (if it's not already there), right-click it, and choose Properties. This brings up a window with the shortcut's settings. In the "Target" field we add a switch that points Chrome to the new temp folder we created earlier.
The existing path is something like this:

"C:\Program Files\Google\Chrome\Application\chrome.exe"

We want to change it to:

"C:\Program Files\Google\Chrome\Application\chrome.exe" --disk-cache-dir="D:\MainTemp\ChromeTemp"

Click OK to save the new shortcut changes.

So now, using this new shortcut, Chrome will store all of its temporary files on the secondary drive.

Hopefully these tweaks will free up more space on your SSD and let it live longer.