
Everyone is flipping out over a picture of a blue-and-black dress. Just when I thought cat videos were the only thing to freak out about. Some say it’s white and gold, others say it’s blue and black. The white/gold perception seems to come almost entirely from women, while the blue/black perception comes largely from men. Why is this? Many people have played with color correction to “prove” their answer is the One True Answer(TM), but the reality is that the dress is actually blue with black accents. Want proof? Here’s the dress in a catalog to show that it is factually blue-and-black:


Is this a dress or a vase?

I know, I know, you’re saying “but I want to know why women see it as white and gold!” It’s simple. Here’s the top portion of the dress image:


It’s gold and white. If you disagree, I’ll tweet bad things about your lawn.

It’s difficult to see ONLY this part and not think that it could be white and gold. The black part clearly has an incandescent spotlight above it somewhere, which bounces off the semi-shiny black fabric to give the appearance of a gold hue. Meanwhile, the extreme backlight in the upper-right corner blows out the picture’s contrast pretty severely, giving the impression that the entire dress coloration is tinted by shadowing from the light source behind it. Combined with the gold “hint” from the incandescent light, this causes anyone who looks at the top of the image first to mentally and subconsciously “auto-correct” their color perception to compensate. Thus, if you look at the top first, you’re seeing a white dress with gold accents. Let’s take a look around where the center body mass would be instead:


It’s blue. It’s not white. Don’t be such a racist.

If your first glance is closer to the center of the body, you’ll see a lot less of the gold “hinting.” Because the black is generally darker and the overall brightness of this section of the image is lower, the blue looks more blue and less white.
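That subconscious “auto-correct” is essentially the brain discounting an assumed light source. Here’s a toy sketch of the idea using a von Kries-style white-balance division; the RGB numbers are made up for illustration and are not sampled from the actual photo:

```python
# Toy illustration of color constancy: the same ambiguous pixel "becomes"
# a different color depending on which illuminant the viewer assumes.
# (Simplified model, not the brain's actual algorithm.)

def discount_illuminant(pixel, illuminant):
    """Von Kries-style correction: divide each channel by the assumed
    light color, then rescale back to the 0-255 range."""
    return tuple(min(255, round(255 * c / l)) for c, l in zip(pixel, illuminant))

observed = (110, 125, 160)  # a desaturated blue, roughly dress-fabric-like

# Viewer A assumes dim bluish backlight shadow: channels rebalance to near-white.
as_in_shadow = discount_illuminant(observed, (140, 160, 210))

# Viewer B assumes bright, nearly neutral light: the pixel stays clearly blue.
as_in_daylight = discount_illuminant(observed, (240, 240, 240))
```

Feed both viewers the same pixel and the shadow-assuming viewer gets back a nearly neutral (white-ish) color, while the daylight-assuming viewer still sees blue.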

Let’s be honest with ourselves about typical instinctual human behavior here: men look at the body first and move around to get the whole picture; women size up the person they look at from top to bottom. Men see the dark part first, women see the light part first, and that’s why they perceive it differently. If the same visual tricks and erroneous hints were somehow swapped, the perceptions would also be swapped. There is also the fact that men and women perceive color slightly differently anyway, with women being more capable of distinguishing slight changes in color and men being better at detecting motion, contrast, and bigger changes in general; it could be that the superior color perception of women works against them given this atrocious lighting and terrible quality camera.

For reference, this is the full dress photo everyone’s so worked up about. What color is it? What color did you see it as when you scrolled down? If you scroll very slowly down without looking directly at the photo, even if you’ve seen it as blue/black every time before, you’ll probably see it as white/gold and immediately wonder if you’ve been slipped a hallucinogen via the Internet. I know that’s how I felt, anyway.

You are looking directly at the end of human civilization as we know it.

By default, every version of Windows since XP creates thumbnail database files that store small versions of every picture in every folder you browse into with Windows Explorer. These files are used to speed up thumbnail views in folders, but they have some serious disadvantages:

  1. They are created automatically without ever asking you if you want to use them.
  2. Deleting an image file doesn’t necessarily delete it from the thumbnail database. The only way to delete the thumbnail is to delete the database (and hope you deleted the correct one…and that it’s not stored in more than one database!)
  3. They consume extra disk space; each database is relatively small, but one accumulates in every folder you browse.
  4. The XP-style (which is also Vista/7/8 style when browsing network shares) “Thumbs.db” and the Windows Media Center “ehthumbs_vista.db” files are marked as hidden, but if you make an archive (such as a ZIP file) or otherwise copy the folder into a container that doesn’t support hidden attributes, not only does the database increase the size of the container required, it also gets un-hidden!
  5. If you write software, these files can interfere with version control systems. They may also update the timestamp on the folder they’re in, causing some programs to think your data in the folder has changed when it really hasn’t.
  6. If you value your privacy (particularly if you handle any sort of sensitive information) these files leave information behind that can be used to compromise that privacy, especially when in the hands of anyone with even just a casual understanding of forensic analysis, be it the private investigator hired by your spouse or the authorities (police, FBI, NSA, CIA, take your pick).

To shut them off completely, you’ll need to change a few registry values that aren’t exposed through any normal control panel (and the Group Policy Editor that can change them isn’t included in any Windows edition below Pro, Enterprise, or Ultimate). Fortunately, someone has already created the necessary .reg files to turn the local thumbnail caches on or off in one shot. The registry file data was posted by Brink to SevenForums. The files at that page will disable or enable this feature locally. They will also stop (or allow) Windows Vista and higher from creating “Thumbs.db” files on all of your network drives and shares.
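For the curious, those .reg files boil down to two values. This is a sketch from memory of the commonly documented value names, so verify against the actual SevenForums download before trusting it:

```
Windows Registry Editor Version 5.00

; Stop Explorer from using the centralized thumbcache_*.db files
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Explorer\Advanced]
"DisableThumbnailCache"=dword:00000001

; Stop Vista and higher from creating Thumbs.db on network drives and shares
[HKEY_CURRENT_USER\Software\Policies\Microsoft\Windows\Explorer]
"DisableThumbsDBOnNetworkFolders"=dword:00000001
```

Setting both values to 0 (or deleting them) re-enables the caches, which is what the matching “enable” .reg file does.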

If you want to delete all of the “Thumbs.db” style files on a machine that has more than a couple of them, open a command prompt (Windows key + R, then type “cmd” and hit Enter) and type the following commands (yes, the colon after the “a” is supposed to be followed by a space; a bare “/a:” tells del to match files regardless of attributes, which is what lets it catch these hidden files):

cd \

del /s /a: Thumbs.db

del /s /a: ehthumbs_vista.db

This will enter every directory on the system hard drive and delete all of the Thumbs.db files. You may see some errors while this runs, but such behavior is normal. If you have more drives that need to be cleaned, you can type the drive letter followed by a colon (such as “E:” if you have a drive with that letter assigned to it, for example) and hit enter, then repeat the above two commands to clean them.
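If you’d rather script the same sweep, here’s a minimal Python equivalent of the “del /s” commands above. Like the command-prompt version, it silently skips files it can’t delete, so “errors” are normal:

```python
# Recursively delete Windows thumbnail database files under a starting
# directory -- a scriptable stand-in for "del /s /a: Thumbs.db".
import os

THUMB_NAMES = {"thumbs.db", "ehthumbs_vista.db"}

def delete_thumbnail_dbs(root):
    """Walk root and remove thumbnail databases; return the paths removed."""
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower() in THUMB_NAMES:
                path = os.path.join(dirpath, name)
                try:
                    os.remove(path)
                    removed.append(path)
                except OSError:
                    pass  # locked or permission-denied, same as del's errors
    return removed
```

Run it once per drive (e.g. `delete_thumbnail_dbs("C:\\")`, then `"E:\\"`) just as you would repeat the del commands per drive letter.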

The centralized thumbnail databases for Vista and up are harder to find. You can open the folder quickly by going to Start, pasting this into the search box, and hitting Enter:

%LOCALAPPDATA%\Microsoft\Windows\Explorer
Close all other Explorer windows that you have open to unlock as many of the files as possible. Delete everything that you see with the word “thumb” at the beginning. Some files may not be deletable; if you really want to get rid of them, you can start a command prompt, start Task Manager, use it to kill all “explorer.exe” processes, then delete the files manually using the command prompt:

cd %LOCALAPPDATA%\Microsoft\Windows\Explorer

del thumb*

rd /s thumbcachetodelete

When you’re done, either type “explorer” in the command prompt, or in Task Manager go to File > New Task (Run)… and type “explorer”. This will restart your Explorer shell so you can continue using Windows normally.

I decided this month that it was time to look at replacing my AMD Phenom II X4 965 BE chip with something that could transcode high-definition video faster. Sure enough, I chose the AMD FX-9590 CPU. Arguments against the AMD FX-9590 on forums such as Tom’s Hardware and AnandTech include “power efficiency is too low/TDP is too high” and “Intel has higher/better instructions per clock (IPC)” and “Intel’s i7 performs so much better.” Notably, the price to obtain the superior Intel performance was almost completely ignored in these discussions. Consider that the AMD FX-9590 retails for around $260 and the Intel Core i7-4770K it is often compared to costs $335; that $75 difference is enough cash to buy a cheap motherboard or a 120GB SSD, and it also represents a 29% price increase over the FX-9590. Does the i7-4770K really perform 29% better than the FX-9590? The short answer is “no.” The long exception to that otherwise straightforward answer is “unless you spend all of your time calculating Julia and Mandelbrot sets and the digits of pi.”

Over two years ago, I wrote an article about how AMD CPUs beat Intel CPUs hands down when you factor in the price you pay compared to the performance you get. Most of the arguments I received against my assertion were against the single-figure synthetic benchmark (PassMark) I used to establish a value for CPU performance. This is understandable; synthetic benchmarks that boil down to “One Number To Rule Them All” don’t help you decide if a CPU is good for your specific computer workload. This time, I’ve sought out a more in-depth benchmark data set which can be seen here. I compiled some of the relevant figures (excluding most of the gaming benchmarks) into a spreadsheet along with the Newegg retail price of each CPU as of 2014-10-23, used a dash of math to convert “lower is better” scores into an arbitrary “higher is better” value, and applied some fixed multipliers per benchmark to make them all fit into one huge graph, which can be downloaded here: cpu_performance_comparison.xls
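The spreadsheet math is simple enough to sketch. The two prices below come from the article; the benchmark scores are made-up placeholders, not real measurements:

```python
# Sketch of the price-scaled scoring used in the spreadsheet.
# Prices are the Newegg figures quoted above; scores are hypothetical.

fx9590_price = 260    # AMD FX-9590
i7_4770k_price = 335  # Intel Core i7-4770K

# The i7 costs ~29% more, so on value it must also score ~29% higher to tie.
price_premium = (i7_4770k_price - fx9590_price) / fx9590_price  # ~0.288

def value_score(score, price, lower_is_better=False, multiplier=1.0):
    """Convert a raw benchmark score into a price-scaled 'higher is better'
    value: invert lower-is-better scores, apply a per-benchmark multiplier
    so everything fits on one graph, then divide by the retail price."""
    if lower_is_better:
        score = 1.0 / score
    return multiplier * score / price

# Hypothetical encode times in seconds (lower is better): the i7 finishes
# somewhat faster, but not 29% faster, so price-scaled it loses.
fx_value = value_score(300.0, fx9590_price, lower_is_better=True, multiplier=1e6)
i7_value = value_score(280.0, i7_4770k_price, lower_is_better=True, multiplier=1e6)
```

The multiplier cancels out within any single benchmark, so it only affects how the bars sit next to other benchmarks on the shared graph, never which CPU wins.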

And now, ladies and gentlemen, the moment you’ve been waiting for: a graph of a wide variety of CPU benchmarks, scaled by the price you pay for each CPU (click to expand the image.)

CPUs in each bar series are ordered by retail price in ascending order. The FX-9590 is in yellow on the left of each series, and Intel only has a CPU that beats the AMD offering in 4 out of 17 price-scaled benchmarks, most of which are synthetic and don’t represent any typical real-world workloads.

AMD wins again.

Update: In case you needed more proof that the FX-9590 is the best encoding chip, someone sent me a few links to more x264 benchmarks: 1 2 3

