My Home Theatre

It’s been a bit of a hiatus… I’ve been busy lately; it’s summertime and all. I’ve been enjoying the beautiful weather with my family, but I feel like I’m due for a post, so here we go!

Home Theatre! This topic interests me big time. Ever since I bought my “new” house 5 years ago I had been planning to do something nice in terms of an entertainment space. The space I had in mind was a bit different than what your typical audio/videophile types dream of, but it’s what I dreamt of at this stage in my life.

I will be the first to admit that this post is late to the game, and I anticipate upgrading my projector and receiver to native 4K within the next 6-8 months. I have my eye on you, Optoma UHD60!

Coming from my previous house (a shoebox), I finally had a big room to call my mancave: a 25′x16′ room. The picture below is pretty unflattering, and it only shows the room from one angle, but it’s all I could find for the time being. This was a couple of days after we moved in. In hindsight I should’ve taken more before and after photos for this project.

mancave_before-1

Here is the original conceptual design of what I envisioned the room to actually become.

mancave-schematic

The first step was to build the A/V closet and shelving. The closet did not exist originally, so I had to rip out some drywall and tie the new framing into the existing framing. Here is a before picture of where I put the closet in.

AVcloset-before

Here are some pictures of the A/V closet build-out. I found the shelf design on another site; if I can find it again I will give them a kudos link, because it is a solid design – a homemade shelf that can hold a ton of weight and gear. I purchased all of the supplies for the shelf at Canadian Tire and Rona, and all of the cabling, connectors, wall plates and in-ceiling speakers through Monoprice.


Here are pictures of what it looks like today. Don’t mind the mess, I have a few kids.

Projector Mount

If you look at the projector mount picture below you’re probably saying, “Wow, that’s a crazy mount, is this guy a nutcase?” Actually, in my mind a design like this is pretty much mandatory if the projector is going to be installed in a basement-like setting.

When I originally mounted the projector I was truly a newbie: I affixed it directly to the floor joists. What a mistake. The feedback was vicious and the projector bounced like there was no tomorrow, and once it started bouncing it didn’t recover quickly, since there was nothing to absorb the movement that reverberated off the joists.

This is something I came up with through trial and error, and it works for me. It doesn’t eliminate movement entirely (if my kids are bouncing off the walls upstairs it will shake), but this design absorbs the movement quickly, and I can rest well knowing that my investment is safe. To date I have almost 5000 lamp hours on this rig, and the projector and lamp still live on.

I used a large piece of MDF that spans three joists, tapped into the joists with 2 1/2″ wood screws. From there I lined up where the projector was going to be mounted and penciled in four pilot holes where the bolts would be installed. These four bolts affix the actual mount to the MDF base; they are 3/8″ in diameter in my application and fairly long, I believe around 3 1/2″. I used several washers, rubber grommets and springs, as you can see from the photo; these items do a lot of the hard work of minimizing vibration and impact.

projector-mount

Projector

At the end of 2012 I was on the hunt for the right projector for me. I didn’t want to spend a ton, but I wanted a projector that was good bang for the buck; good input lag and 3D were mandatory. I stumbled across the BenQ W1070 Home Theatre DLP Projector. It’s a great unit. I’ve been using it for almost 5 years now and it’s done really, really well, no issues whatsoever.

benq_w1070


Screen

I went the do-it-yourself route. After abundant research I ended up using the following paint for the screen:

Sherwin Williams ProClassic Smooth Enamel Satin Finish Extra White – 6260 UNIQUE GRAY. I don’t believe Sherwin Williams carries this formulation anymore.

Down the line I believe I will switch to an actual screen for my next projector install. Don’t get me wrong, the paint is great and a money saver, but I found that it cannot cover imperfections in the actual drywall. If you look close enough you can pick up on these subtle flaws while the unit is on.

I used a somewhat dark color for the rest of the wall around the screen. Sherwin Williams Classic 99 Satin Finish Extra White – 6549 ASH VIOLET.

The screen is approximately 110″ measured diagonally.
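Since the W1070 is a 1080p unit the screen is 16:9, and for anyone sizing their own paint-on screen, the width and height fall out of the diagonal with a little right-triangle math. A quick sketch, nothing specific to my projector:

```python
import math

def screen_dimensions(diagonal, ratio_w=16, ratio_h=9):
    """Width and height of a screen given its diagonal and aspect ratio."""
    factor = diagonal / math.hypot(ratio_w, ratio_h)
    return ratio_w * factor, ratio_h * factor

w, h = screen_dimensions(110)  # my 110" diagonal at 16:9
print(f"{w:.1f}\" wide x {h:.1f}\" tall")
```

So a 110″ diagonal works out to roughly 96″ wide by 54″ tall, which is handy to know before cutting the trim for the frame.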

Screen Frame

I used 2 1/2″ MDF trim, mitered the corners at 45 degrees and installed L-shaped hinges on the back side. I primed and painted it with flat black paint and used a brad nailer to affix it to the wall.

Speakers

I opted for a 7.1 configuration. I got a sweet deal on the front and center speakers from Newegg; they were on clearance, dirt cheap, and I could not pass it up. I picked the JBL Studio 1 Series Studio 190 front and center speakers. For the subwoofer I went with the Klipsch KW-100, for the sides Klipsch RS-62s, and for the in-ceiling speakers the Monoprice 6-1/2 Inch Kevlar 2-Way In-Ceiling Speakers.

For the in-ceiling speakers I cut plywood templates to hold the speakers, since I have a drop ceiling with soft fiberglass tiles. Each plywood template fits into the 2′x2′ grid, so the grid takes the weight of the speaker and not the tile.

For the Klipsch surround speakers, I mounted each one to a stud on opposing walls using a single screw.

I am not an audiophile, but they sound good to me. Most would recommend not mixing and matching, but really I was going with the best value/deal at the time, as speakers can be really expensive for anything better than bottom of the barrel.

AV Receiver

For the receiver I went with the Onkyo TX-NR616, I had never purchased an Onkyo before but I can say I have been really happy with it.

The receiver cannot fully power my front speakers in its current 7.1 configuration; in 5.1 it could power them fully. The sound is still good and I don’t pump it too often, but it’s something to keep in mind if you are purchasing an AV unit.

The one issue I have had seems to be some kind of glitch where HDMI switching stops working after the projector is turned off. It doesn’t happen all the time; it’s a random thing. Simply power cycling the receiver corrects the issue.

IR Repeater

For extending my IR remotes (satellite receiver, AV receiver, etc.) I went the cheap route. I picked up a USB-powered IR repeater from Amazon, the Neoteck IR Repeater Infrared Remote 1 Receiver 4 Emitters Control Kit. I just plugged it into my AV receiver’s USB port for power and installed the IR receiver discreetly along the edge of my drop ceiling. It’s cheap, but it does the job, and I can close my cabinet if need be and not have to fight with pointing remotes directly at the device.

Fix graphical desktop artifacts in Crossfire.

Tools:

Hawaii Bios Reader

Atiflash 4.17

DOS boot disk

HxD hex editor

Hawaii Fan Editor

I have scoured the internet for a solution to my long-standing problem with my Crossfire setup, but after much digging my searches yielded no results. The problem: when sitting idle on the desktop, the cards in Crossfire would artifact. I have the problem documented here.

Inside my computer I have two R9 290X cards by Gigabyte in Crossfire; these are the Windforce editions, exact model GV-R929XOC-4GD. One uses the F2 BIOS, the other the F11 BIOS. When I game, the temps average about 60-70 degrees Celsius on the GPU cores and about 95-100 degrees on the VRM. My CPU doesn’t exceed 45 degrees. The cards are at stock clock speeds and both BIOS versions are the same; I recently updated the BIOS on both cards, but that did not fix the issue.

In short, I can do about a 2-hour gaming session and everything runs smoothly; then, when I exit to the desktop, I get artifacts: lines across all 3 monitors. As soon as I go into a game again these lines disappear; back on the desktop, they reappear. If I bring up anything graphical, like a web page or YouTube, the lines disappear; if I minimize the browser they reappear. If I stay on the desktop and disable Crossfire, again the lines immediately disappear.

I initially suspected it was the fact that I was running a Crossfire setup. My other suspicion was that, despite both cards being the same make, one has memory chips by Hynix (F11 BIOS) and the other by Elpida (F2 BIOS). I believed the problem was with the memory, or rather something to do with the memory.

Noteworthy: when running only a single card this artifacting problem does not occur. It only happens in Crossfire and when the cards are in a low-power state, i.e. idle, when the clocks are dropped to conserve energy.

After much tweaking of the system and performing various tests, it all came down to the memory clock: the clocks on the memory were being stepped down to almost nothing. The reason I suspected the clocks is that the problem disappeared whenever I went into a graphically intensive application. And the reason I knew it was the memory clock and not the core clock: the core clock would step up on demand, but the memory clock would not. It had two states, 150 MHz or 1250 MHz, and it only jumped to 1250 when something graphical was being presented on the desktop or a game was being played. During “PowerPlay” mode the card’s core clock drops to 350 from a potential 1040, and the memory drops to 150 from 1250. Mind you, the core can be stepped up on demand, and it does this rather well; the memory, apparently, not so much.

To edit the BIOS files and flash them, they require a *.rom extension. The files from the manufacturer did not have this extension, so I renamed them to include .rom and flashed them using Atiflash; it worked and my cards are running fine.

In order to fix the issue I had to hex edit both cards’ BIOS files and flash them with Atiflash in DOS. I also disabled ULPS. Although ULPS is not a fix for the issue, I like knowing that when I hop out of a game the fans will keep spinning to cool the card down to an acceptable temperature; I don’t like the idea of one card being passively cooled after it has reached 80+ degrees. Essentially, I edited both cards’ BIOS files to never drop the memory clock, so the memory clock is now always at 1250 MHz, and this fixed the problem. I made other tweaks to the BIOS as well. While not necessary, I also edited the core clocks: the core now never drops below 500 MHz, the next step up is 840 MHz, and then 1040 MHz. This was changed from 300 MHz, 727 MHz, and 1040 MHz respectively. Below is a screenshot of the PowerPlay profile changes, original on the left and edited on the right.

Capture1
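To summarize the PowerPlay edits in one place (clock values in MHz, taken from the numbers above; this is just an illustration, not a tool):

```python
# DPM clock states (MHz) before and after my Hawaii Bios Reader edits.
stock  = {"core": [300, 727, 1040], "memory": [150, 1250]}
edited = {"core": [500, 840, 1040], "memory": [1250, 1250]}

# The actual fix: the memory clock can no longer step down to 150 MHz,
# so the idle-desktop state that triggered the artifacts never occurs.
assert min(edited["memory"]) == 1250
```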

Finally, I also changed my fan profiles and a single temperature profile. Since I raised the core clock slightly and the memory clock completely, I wanted to make sure the card was not running hot, so I raised the fan speeds by 10% and dropped the top temperature point by 10° C.

Capture3

The new version of Hawaii Bios Reader (left) can edit the fan profile.

The single temperature point I was worried about was 90° Celsius/100% fan; I changed it to 80° Celsius/100% fan speed. Then I raised the other fan speeds by 10%, so 56% went to 66% and 25% went to 35%. You can see below the changes I made to the fan profile as displayed in Hawaii Bios Reader. Note that in older versions of Hawaii Bios Reader the fan profile values can only be read, not changed; only the PowerPlay values are editable there, so the fan values need to be changed in a hex editor such as HxD. Alternatively you can use the Hawaii Fan Editor by DDSZ. The new version of Hawaii Bios Reader can now edit the fan speeds and temperatures on the Fan profile page, so it is no longer necessary to hex edit the ROM file.

Capture2
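The fan tweaks amount to a simple transformation of the (temperature, fan %) points. Here is a sketch; note the temperatures of the two lower points are placeholders for illustration, since only the 90° point is given above:

```python
def tweak_fan_profile(points, fan_bump=10, top_temp_drop=10):
    """Raise every fan speed by fan_bump (capped at 100%) and pull the
    temperature of the 100%-fan point down by top_temp_drop degrees."""
    tweaked = []
    for temp_c, fan_pct in points:
        if fan_pct >= 100:
            tweaked.append((temp_c - top_temp_drop, 100))
        else:
            tweaked.append((temp_c, min(fan_pct + fan_bump, 100)))
    return tweaked

# Fan percentages are my real values; the 40/60 degree points are made up.
stock = [(40, 25), (60, 56), (90, 100)]
print(tweak_fan_profile(stock))  # [(40, 35), (60, 66), (80, 100)]
```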

The last step after the BIOS was edited was to flash the file using Atiflash within DOS. Download the boot disk and create a DOS-bootable flash drive. Place the ROM file and Atiflash in the root of the flash drive. Boot into DOS and flash the new BIOS for your card. Remember to do only one card at a time and to power down after each flash. Also flash one bank at a time; I keep the original and the new BIOS on each card, and I used the performance bank to flash the custom BIOS. Atiflash usage is as follows:

atiflash -p 0 biosname.rom
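Before programming anything, it’s worth saving each card’s existing BIOS as a backup. A typical session looks something like this (adapter numbers come from the info listing; I’m going from memory here, so double-check the flags against your Atiflash version’s help output):

```
atiflash -i                  :: list adapters and their numbers
atiflash -s 0 backup0.rom    :: save card 0's current BIOS to a file
atiflash -p 0 biosname.rom   :: program card 0 with the edited BIOS
```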

With all these changes to the GPU BIOS on both cards, I have now eliminated the desktop artifacts. My idle card temps hover around 50° C, about 3-5 degrees higher than with the stock BIOS clocks, and ULPS is disabled. Everything is peachy on the gaming PC.

Here are the two sample ROMs I created for my cards, F2 and F11.

For more detailed information check the below links and sources.

Disabling ULPS: Open regedit and search (Edit – Find) for EnableUlps, then change the DWORD value from 1 to 0. Ignore EnableUlps_NA, it does nothing. Keep searching through the registry (pressing F3) and change every entry you find from 1 to 0. Once finished, reboot. Although disabling ULPS is not necessary, I like it because with this feature off the driver does not disable the secondary card after a gaming session, which in turn allows the fans to cool the card properly instead of just shutting it down.
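If you prefer not to hunt through regedit by hand, the EnableUlps values live under the display-adapter class keys. A sample .reg fragment for one adapter instance looks like this; the GUID below is the standard display adapter class, but the 0000/0001 instance numbers vary per system, which is why searching as described above is the reliable route:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```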

Editing the VGA BIOS: I used Hawaii Bios Reader, which is capable of creating a proper BIOS checksum so the file can be flashed to the card. Essentially, in Hawaii Bios Reader I edited the clock frequencies, then changed the fan and temperature profiles with a hex editor (I used HxD). Be aware that if you use HxD after the Hawaii tool, you will need to open the hex-edited file and re-save it in Hawaii so it retains the right checksum for flashing; otherwise the card will not take your custom BIOS.
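For the curious, here is roughly what that checksum fix-up amounts to. This is a sketch based on my understanding of the AtomBIOS layout (image length in 512-byte blocks at offset 0x02, checksum byte at offset 0x21); re-saving in Hawaii Bios Reader does this for you:

```python
def fix_atom_checksum(rom: bytearray) -> bytearray:
    """Recompute the checksum byte so all bytes of the BIOS image sum
    to 0 mod 256; otherwise the card will reject the edited ROM.
    Offsets below are my understanding of the AtomBIOS header layout."""
    CHECKSUM_OFFSET = 0x21
    image_size = rom[0x02] * 512   # image length stored in 512-byte blocks
    rom[CHECKSUM_OFFSET] = 0
    rom[CHECKSUM_OFFSET] = (-sum(rom[:image_size])) & 0xFF
    return rom
```

After hex editing the fan values in HxD you would run the ROM through something like this, or simply re-save it in Hawaii Bios Reader, before flashing.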

Sources: 1, 2, 3, 4

PC Gaming accessories: Mouse

I’ve always used Logitech mice, except for a long, long time ago when I had a Dell mouse that came with one of my first gaming/school PCs. But even prior to that I had used basic Logitech mice. I’ve tried alternatives, but I find that most companies don’t nail the ergonomics necessary for prolonged gaming sessions. Logitech has done mouse ergonomics well for a long time now, and it is one of the few companies that still does. I still own my older G5 Laser Mouse; it’s kicking around in one of my drawers as a backup. Somewhere else, in a box stored under my staircase, lies an old MX510 which I picked up at Staples while I was in college. The MX510 replaced my stock Dell mouse. Then a wireless MX700 replaced that, the G5 Laser replaced that, and the G700 replaced that in turn. Recently I thought I broke the G700: I slammed it on my desk in a furious moment of rage and frustration, which messed up the scroll wheel functionality. Having thought I broke the scroll wheel, I replaced the G700 with the wireless optical G602. Yes, sometimes I rage when I play games… sometimes.

From my personal experience over two decades of PC gaming, I can confidently say that Logitech mice are rock solid and have a good ergonomic shape. I’ve tried other mice, but they just don’t cut it for me like the Logitech ones; they don’t have the comfort or reliability. Logitech mice are comfortable and can take a good beating. Having said this, they are not without their flaws, and I have not found the perfect mouse yet. Even though this might be somewhat biased by my experience, go ahead and check it out.

Also, for the heck of it, I will toss in a couple of notes about the older mice, just to reminisce a little, but mostly this is a comparison of a couple of Logitech wireless gaming mice. Note that most wireless gaming mice from Logitech are a bit heavier due to the added weight of the batteries. This suits me just fine, as I like a slightly heavier mouse for gaming; a lot of people don’t, in which case a wired mouse would be the way to go.

MX510: Great ergonomic feel. It’s a basic optical mouse designed for non-gaming use, but I used it for gaming a lot. It was comfortable and light. It’s still in a box in my home.

418E7H3ZYHL._SX300_

MX700: This was one of my first forays into wireless/gaming mouse territory. Great ergonomic feel, shaped much like the MX510, but to be honest I wasn’t happy with this mouse. I paid $120+ for it at the time and kept it for less than a year before giving it away. The battery life on these was terrible and the performance not much better. I understand sacrificing battery life for a wireless experience, but this mouse had very few benefits to owning it.

MX700-930754-0215-R-unit

G5: A solid laser performer, and my first laser mouse; I didn’t know what I was missing with optical mice until I bought this. Weight can be adjusted by inserting and removing 1.7 g and 4.5 g weights into a tray that slides in and out of the bottom of the mouse. The threaded cable shielding proved to be a burden, and I eventually removed it completely: the cable underneath would get trapped and fold through the shielding, and I worried this would tear the wire inside.

G402: Dubbed the Hyperion Fury. A good basic optical mouse, comfortable and good for gaming. Light, suited to smaller hands, so as not to over-accelerate when aiming or moving. If you are looking for a wired, non-laser gaming mouse this would probably be it, although the $50+ cost is a little too much; personally I don’t think it’s worth the money.

logitech-g402-hyperion-fury

G602: After having owned laser mice, this wireless optical device lacks the smoothness and precision I get out of a laser sensor. I also find the profile of this mouse a little too low. Since I like to rest my palm on the mouse, and my hands are rather big, my hand starts to cramp up in longer gaming sessions. The battery life is amazing on this device: it takes two AA batteries and has a switch on top to flip between performance and endurance mode. Endurance mode lets it sip juice from one battery at a time and grants 1400 hours of use; performance mode lets it shine in gaming at full power, giving you a lesser 250 hours. I assume the switch changes the polling rate at which the mouse tracks movement across a surface. Personally I find this mouse just OK for gaming; I’m not a fan of optical technology, as any imperfections on your tracking surface, such as grease and dust, will mess with the optical sensor and its accuracy. Other than the sensor, the mouse is fully featured and does not lack functionality, although the construction feels a little flimsy. The software is OK, just OK, but then again I never use the G software with any of my Logitech devices; I don’t much care for macro functionality since I don’t play MMOs, and the software annoys me more than anything else, so it’s never on my system. Since the battery life is rather long, this device omits the USB cable/charging option that the G700/s has. It’s a shame, as the last thing you want to be doing during a gaming session is fumbling for a fresh pair of batteries. One thing to note: the middle mouse button was dead right out of the box. This seems to be a hardware defect, as stated by Logitech to one of their users on the forum, and it appears to plague a lot of devices across this line of mice, so it is not a matter of getting a lemon but rather a hardware design flaw. Also, the G602 has 6 thumb buttons; that’s 4 too many, and since the software is useless, that’s 4 buttons without a function. I haven’t tried it, but perhaps they can be mapped directly in-game via key bindings.

g602_1

G700: The G700 is the workhorse of all the mice I have had. It is solid and takes a beating; trust me, it knows how to take a beating, I’ve abused this mouse. I like the larger profile, and it is very comfortable for larger hands in longer gaming sessions. What the G602 lacks the G700 has: precision and comfort, along with a shorter battery life. The batteries don’t last nearly as long as they do in the G602. Having said this, unless you do 12-hour gaming sessions that will not be an issue; also, unlike the G602, the G700 has a micro USB plug at the front so you can charge the battery and game as you would with a wired mouse.

g700

The G700s is its successor, with minor alterations to the design such as the sensor and the coating on the mouse, which add better perspiration protection and grip; mostly cosmetic changes over the G700. Worth every penny. I have had my G700 for 3+ years, it hasn’t skipped a beat, it’s amazing for gaming and I highly recommend it. I bought one for a buddy on his birthday last year and he loves it as well.

logitech-g700s-910-003584-rechargeable-gaming-mouse

Mousepad: The surface on which you game makes a huge difference, especially if you are using an optical mouse, not so much with a laser mouse. The optics of a mouse can be impeded by dirt, dust, or any other non-uniform surface; try using an optical mouse on glass, it simply won’t work. A proper mouse pad not only provides a clean surface but also one with very little friction and resistance to mouse movement, so your acceleration is not impeded and, depending on the type of pad, might actually improve. I prefer hard mouse pads over cloth ones; they are generally a lower-resistance surface that aids mouse precision.

mouspad

Mouse software: Do not use the G software with these mice. I noticed that the software resets the mouse values and settings, which was odd, but it did. As soon as I removed the mouse software, the mouse immediately started functioning properly again. That is a huge fail on Logitech’s part. Other than that, I appreciate these mice; they are basic, ergonomic, and for the most part reliable.

Nevermind the Oculus Rift, I’ll take a Microsoft HoloLens.

The Windows 10 presentation from Microsoft took me completely by surprise: Windows 10 and the Xbox gaming experience, the nice integration of Steam into the Xbox app, and other neat little features. Let’s not forget the Microsoft Surface Hub. All in all, not too shabby.

Let’s get back to my main point. Everyone has been all over the Oculus Rift, and VR in general, for the last year or so. Recently even Samsung released their version of a VR headset, the Gear VR. Personally I think the Gear VR is a complete waste of your money: you are limited to apps in a closed software ecosystem, to one manufacturer and, at the moment, to one phone. All aboard the fail boat. Also, all these VR headsets promise only 1080p resolution split in half, one half per eye. In an age where UHD televisions will soon be taking over, and where 2560×1440/1600 is pretty much the norm for computer monitors, VR headsets have a little catching up to do. Next-gen PC video cards will be able to handle UHD gaming as well. I’d also like to mention the Canadian equivalent of the Oculus, the Totem VR, by a company from Montreal.

Then there is the fact that VR is very anti-social. You close your senses off to the rest of the world and delve into one just by yourself.

Insert the Microsoft HoloLens. Have a look at the commercial.

This isn’t just a device for entertainment; it’s also a collaboration tool. A multipurpose device that uses Augmented Reality (AR) to display computer-generated images and video streams around your home. Microsoft actually had to create a new processing unit for this: a processor which measures and maps your surroundings so that the images and sounds rendered via the HoloLens appear in the appropriate space. I’m talking about the new HPU, or Holographic Processing Unit.

The applications for this processor alone are amazing: new mapping technologies and new ways to map areas. Imagine a quadcopter fitted with one of these HPUs sending back telemetry data in real time; you could map and navigate areas that were previously inaccessible to humans. It is said that NASA is using the HoloLens as a collaboration tool for its Mars mission.

Surely this is a hit for Microsoft. I personally cannot wait to get my hands on one of these devices; I have shifted from contemplating a VR headset to most definitely getting a HoloLens. Well played, Microsoft. Get your wallets ready.

UPDATE 2016/03/09: Never mind the HoloLens, Microsoft over-promised and under-delivered. The HoloLens is a steaming pile of shit. The HTC Vive came out swinging and I pre-ordered that. Room-scale VR is where it’s at at the moment.

The state of Video Game affairs in Canada.

Ever since the inception of the $69.99 price point for video games here in Canada, I’ve been very vocal about how we Canadian consumers are getting ripped off. This is very evident if you walk into your local Best Buy, which I did on the weekend: some game pre-orders are 59.99, others are 64.99, and most sit at the 69.99 price point. Nobody has officially come out and said that the price hike is due to the exchange rate; there is only online speculation, which is sound and logical. Personally, I think the 69.99 price hike is due to greed, as there is no consistency in price between software publishers, and a 10% exchange-rate difference should yield a price closer to $65, not $70. Funny that when the US dollar was trading below the Canadian dollar we were still paying the same as the US.

More recently the PS4 hardware price was raised from 399.99 to 449.99, an extra $50 out of Canadian gamers’ pockets per PS4 sold. The Xbox One, however, maintained its original price. A few weeks ago Microsoft announced a Kinect-less Xbox One that will sell at the 399.99 price point. I was curious to see what the price would be in Canada, and to my surprise I learned that this Kinect-less Xbox One can be pre-ordered on the Best Buy Canada site for 399.99 as well. Interesting that a hardware manufacturer is willing to take a slight hit on its hardware while the majority of software publishers have actually increased their profits beyond the exchange rate. Sorry, it’s not interesting, it’s very greedy. I would like to thank Microsoft for showing us Canadian gamers some love, just like they did when they lowered the 360 price in Canada first.

xboxone

Best Buy Canada screen shot.

Right now I’m going to pick on a single game that I have a slight problem with: Wolfenstein: The New Order. One thing that irks me about a title like the new Wolfenstein is that it does not have any multiplayer, online or co-op gameplay aspect to it, yet they warrant charging 69.99 for the software. This game at best should be 49.99, even with the 10% exchange increase… but I digress. To be honest I was going to pick this title up when I learned of it, until I also learned that it is a purely single-player game and that it would be selling at $69.99 CAD.

So the question everyone should be asking is: does a 10% exchange-rate difference warrant a 17% price hike? And will the price drop once the exchange rate balances out?
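The arithmetic behind that question, using the prices above (the 10% exchange figure is the rough rate at the time):

```python
us_price = 59.99      # long-standing US price
old_cad  = 59.99      # what we paid at (near) par
new_cad  = 69.99      # the new Canadian price point
fx_rate  = 1.10       # roughly a 10% exchange-rate difference

hike_pct = (new_cad / old_cad - 1) * 100
fx_fair  = us_price * fx_rate

print(f"actual price hike: {hike_pct:.0f}%")       # ~17%
print(f"exchange-adjusted price: ${fx_fair:.2f}")  # ~$65.99
```

A 17% hike against a 10% exchange move is the gap I’m complaining about.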

For your consideration, below is a historical look at the USD-to-CAD exchange rate, ranging from 1990 to now. See that huge spike just before January 2000? That is when I purchased Turok 2 for the N64, and with taxes it cost me about $120. Ouch.

exchange

On the other hand, PC prices are more forgiving. This is partly due to the fact that Steam and others like Green Man Gaming deal in US currency, which reflects the true pricing and not this inflated cash grab.

steam

There are ways to get around this ridiculous pricing on the Xbox One. I’m not sure if this is possible on the PS4, but let me know if it is. If you buy digital on Xbox Live and use PayPal to pay for your purchases, you can temporarily switch your Xbox One to the US region, purchase the game and then switch back to your Canadian region. Online users have reported this as working. This way you pay what the game is actually worth, except for Wolfenstein.

This brings me to another point: why the heck are we paying full price for video games sold to us digitally? Games that have no resale value and do not impact sales of new software. Yes, we have to consider bandwidth and server storage, but come on, we have effectively eliminated the need to print a cover, press a disc, mould a case, ship to a distributor, ship to a retail store, travel to said store… etc. You get the point. So why is digital the same price as physical? It makes no logical sense. An incentive in the form of a discount for digital distribution would be most welcome, and it would push digital over physical.

I understand the 64.99 price point, but going to 69.99 is just plain greedy. I’m not sure who is setting these prices, whether it’s the distributors or the retailers. One thing I know for sure is that the gaming industry is run by dinosaurs that do not know how to adapt to new and emerging technologies, and, like the music industry, is driven by corporate greed.

Fix your dead or noisy video card fan.

Recently I had the opportunity to take apart and fix a pair of AMD HD 5870 reference cards. I didn’t have to take them both apart; I only needed to fix a noisy fan on one of them. A noisy fan is usually due to the grease or oil drying out in the bearing chamber of the fan. This card gave me a good 4 years of service before it started making noise. After fixing the fan on the first card, I noticed how quiet it got and that its temperatures and fan speed were lower than before, so I decided to do the same to the second card.

I’d say that a pair of these cards, provided you have at least 16 GB of RAM in your computer, is still relevant in the PC gaming space. Why 16 GB of RAM? Well, these come equipped with only 1 GB of GDDR5 memory (VRAM), and that is not enough for Ultra settings in most games, so on certain software the system will use your computer’s RAM to hold what it can’t fit in VRAM. Note that system RAM is much slower than VRAM, so there may be some frame drops going this route. Still, two of these are capable of achieving Ultra settings in most games at 1920×1080 and 60 frames per second. So I decided to ship these two cards to a buddy of mine, but first I had to lube them up.

Here is what a reference 5870 from AMD looks like. All AMD reference cards are built in a similar fashion and this process is applicable to them.

 SAPPHIRE HD 5870 1GB GDDR5 PCIE (Game Edition)


Here are the materials I used to tune up the cards: a jeweller’s screwdriver set, a pair of pliers, thermal paste, a cleaning mixture of water, vinegar and lime juice (alcohol works as well, and so does cologne, as it’s alcohol-based), and 3-in-1 all-purpose oil. Optionally, a can of air helps to get the dust out of the fan and heat sink fins.

If you want to test your card’s temperatures and fan speed before doing this, install FurMark; it will allow you to stress test the card (burn-in) and capture the metrics. For the 5870, as for many reference cards, flip it over so the fan opening is facing down and remove the marked screws. The ones in red are a different size than the ones in green. There are also two screws on the port face.

Once you’ve removed the screws and lifted the back plate and the bracket in the middle, carefully separate the PCB from the heat sink and fan assembly. Slowly and with care, start prying the PCB from the fan and sink assembly at the sides closest to the port plate, and keep prying with your fingers while moving towards the other end. Be careful not to separate the PCB from the assembly with too much force: at the other end the fan is connected to the PCB, and you don’t want to rip out the connector.

Note that the only thing holding the PCB to the heat sink assembly is the dried thermal paste, so it should not take a lot of force to pry apart. Still, take great care not to damage any of the transistors, resistors, and other components; otherwise you’ll need to bust out the soldering iron. Pry the PCB about a centimetre away from the assembly, then fold it open as seen in the image above so you can detach the fan connector from the PCB. Afterwards, rest the PCB on a non-static towel or mat. Be careful with the thermal pads on the cooling unit; unless you have spares, keep them intact.

Next we’re going to take apart the fan and heat sink assembly so we can clean it and lube the fan. Remove the screws circled in red; at this point you may also want to remove the screws circled in green, which belong to the fan. If you do remove the green ones, be careful taking off the plastic housing and make sure the fan doesn’t go flying out.

 

 

See that circular sticker on the bottom side of the fan? On most fans there is an opening beneath it that gives access to the bearing chamber: heat the sticker with a blow dryer, peel it back, and put 3-5 drops of all-purpose oil into the opening. That’s not the case with these reference cards, though; that part is sealed off. The only option is to pop the motor out and add the oil from the top. To do that, wedge three screwdrivers between the corners of the triangular plastic piece that joins the motor to the fan; it should be a snug fit. Apply equal downward pressure on the ends of all three screwdrivers, and once you hear a loud pop, stop. Lift the motor out of the fan, then add 3-5 drops of all-purpose oil into the bearing chamber, trying to keep the oil in the chamber only. Push the motor and fan back together until you hear a snap, then spin the fan while holding the motor for a minute or two so the oil circulates through the chamber.

Below is a video demonstrating the removal and separation process of the motor from the fan.

 

The oiling of the bearings and removal of the dust made a huge difference in both cards: about 4 degrees lower temperature and 4% lower fan speed on each. One of the cards even started running silently again.

Happy gaming!

UPDATE 2016/03/09: I have since used this method a couple times. I bought an R9 290X off eBay and 2 out of 3 fans wouldn’t spin. Using the above method I brought them back to life and have been using the card for over a year without hiccups.

Mantle, first impressions in Battlefield 4.

MantleBenefits

Last week DICE released an update to Battlefield 4 on the PC. This update adds support for Mantle, AMD’s GPU API. A day later AMD released their Catalyst 14.1 beta v1.6 driver, which supports the API. There are known issues with this driver, so be wary. All graphics processors based on the Graphics Core Next architecture are supported by Mantle. I’m not going to get into details about the architecture or which devices are supported; if you want that information, just head over to the AMD site. I had a chance to quickly test the performance of the API and the game. This was a very brief test with no metrics attached, only my experience.

FBMantle

Either way my setup is as follows:

Eyefinity with three Dell IPS monitors @ 1920×1200 (16:10 aspect ratio), a Gigabyte Windforce OC HD 7970 card, an AMD Phenom II X6 1090T processor, 16 GB of RAM, and everything stored on SSDs.

pc

Collectively in Eyefinity, the max resolution achievable on this hardware is 5760×1200. However, I had been playing Battlefield 4 at 5040×1050, which corresponds to three 16:10 monitors at 1680×1050 each. Truth is, the system could not handle the higher resolution, and playing on low was not an option; 5040×1050 ran on high with a couple of tweaks.
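The Eyefinity arithmetic is simple enough to sketch: the three panel widths add up while the height stays the same, and the pixel counts show how much extra work the GPU takes on at native resolution.

```python
# Eyefinity combines three side-by-side monitors into one wide surface:
# the widths add up, the height stays the same.
def eyefinity(width, height, monitors=3):
    return (width * monitors, height)

native = eyefinity(1920, 1200)    # the panels' full resolution
playable = eyefinity(1680, 1050)  # what the HD 7970 managed pre-Mantle

# Ratio of total pixels rendered per frame at each resolution.
extra = (native[0] * native[1]) / (playable[0] * playable[1])
print(f"{native[0]}x{native[1]} pushes {extra:.2f}x the pixels of "
      f"{playable[0]}x{playable[1]}")
```

That works out to roughly 31% more pixels per frame at 5760×1200 than at 5040×1050, which is the gap Mantle had to close.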

With Mantle and the BF4 update, I was able to go from 5040×1050 in BF4 to the full 5760×1200. That is pretty good; I thought I would have to wait another hardware generation before I could max out my monitors’ resolution.

What does Mantle do? Mantle optimizes the rendering process by submitting batches of work from the processor to the GPU in parallel. DirectX is supposedly also capable of parallel submission, but developers have not yet been able to achieve this with DirectX, so it sends the batches serially instead. Basically, with Mantle the information gets to the GPU faster, there is more of it, and it is optimized for the hardware, since Mantle is not a generic API meant for all hardware.
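As a toy analogy (this is not the Mantle API, just an illustration of why parallel submission helps a multi-core CPU keep the GPU fed), compare one thread preparing batches one after another against several threads preparing them at once:

```python
# Toy analogy: serial vs parallel preparation of draw-call batches.
# The sleep stands in for CPU time spent building a batch.
from concurrent.futures import ThreadPoolExecutor
import time

def build_batch(batch_id):
    time.sleep(0.01)  # pretend the CPU is preparing draw calls
    return f"batch-{batch_id}"

batches = range(8)

# "Serial" submission: one thread handles every batch in turn.
t0 = time.perf_counter()
serial = [build_batch(b) for b in batches]
serial_time = time.perf_counter() - t0

# "Parallel" submission: several CPU cores prepare batches simultaneously.
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(build_batch, batches))
parallel_time = time.perf_counter() - t0

print(f"serial {serial_time:.3f}s vs parallel {parallel_time:.3f}s")
```

The same batches come out either way; the parallel version just finishes sooner, which on a slower multi-core CPU is the difference between starving the GPU and keeping it busy.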

What does this mean? If you have a slower multi-core processor and a mid-range to high-end video card, you will see some performance increases. In my case I was able to go from 5040×1050 to 5760×1200 without any performance hit. Prior to Mantle this was not possible on DirectX with this hardware.

What about high-end PCs? Higher-end machines with processors capable of fast single-threaded processing will not see a big performance boost. Mantle is meant for the mid range, AMD’s sweet spot. There are other hardware limitations too: lower-end GPUs are not seeing a great performance boost either. The sweet spot seems to be a mid-range CPU paired with a mid-range to high-end GPU.

Mantle shows great promise for the future, especially for games built from the ground up to support the technology. I’m also sure there is more performance that can be squeezed out of the hardware and the API; it is still in its infancy.

Now if DICE could only fix the broken game.