A small form factor graphics card is a graphics processing unit (GPU) that is designed to fit into smaller computer cases. These cards are typically shorter and have a smaller footprint than their full-size counterparts, making them ideal for building small form factor (SFF) PCs.
While small form factor graphics cards don’t usually offer the same performance as their larger counterparts, they can still be capable of playing modern games and running demanding applications. Many SFF GPUs also support features like multiple monitors and multiple graphics cards (SLI or CrossFire), making them a great option for power users.
This card meets the need for a low-end graphics board in a flight simulator PC. It has 2GB of graphics RAM, which meets the threshold for X-Plane to use the Vulkan API, and that boosts performance. It also works alongside the "shared video RAM" on the motherboard (Ryzen 5 5600G) to use the APU's graphics features.
With a single HDMI and a DVI-D connector, it raises the total number of video ports (two on the motherboard) for a multi-monitor flight sim configuration. It doesn't deliver the highest FPS, but it is very adequate for flight simulation.
This variant has only DDR4 RAM; other variants come with GDDR5 for a bit more cost.
I bought this for my 20-year-old games (1999-2015 titles)! Just an outstanding little powerhouse! You see, back then my video cards were 256MB DDR3 cards, but this killer card can do high-resolution, incredibly smooth gaming on my old games with everything turned up, even GTA 4 and GTA 5! Battlefield 2, wow, and 3 was so great! Return to Castle Wolfenstein and so many others! Far Cry 1 and 3 were mind-blowing and smooth!
It's small, uses 30W of power, and fits anywhere, in any PC! If you're going to play new games, get a GTX 1650 GDDR6 Super OC'd instead; that's the best graphics card bang for the buck, with high 1080p frame rates. But if you want to play old games, kick butt, and get insane frame rates too, this little jewel is a diamond in the rough for old programs and games, with the power of a big killer graphics card of the past in a small package!
This GT 1030 OC 2GB GDDR5 basically runs like an old XFX Radeon HD 5870 2GB OC'd Special Edition, so you have an idea of what you're getting when you buy it, and it's better than an old 5870 in newer titles.
So if you cannot get a GTX 1650 and mostly play older titles, you'll be happy.
I purchased this video card for a system dual-booting Ubuntu and MidnightBSD. It works fine for a casual desktop and handles resolutions up to 2K without issue. The card can't run 4K at 60Hz due to the old HDMI standard, which limits it to 30Hz.
This card is horrible for gaming, so run away if that is your use case. You can play Counter-Strike on it, but newer games are out of the question. The card has low power requirements and can run off the PCIe slot without extra PSU cables.
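The 30Hz ceiling mentioned above is consistent with an HDMI 1.4-class link somewhere in the chain; a rough back-of-envelope check (ignoring blanking overhead, and assuming 24 bits per pixel) shows why 4K at 60Hz doesn't fit:

```python
# Rough check of why an HDMI 1.4-class link tops out at 4K30:
# uncompressed data rate = width * height * bits_per_pixel * refresh.
# (Blanking intervals are ignored, so real requirements are even higher.)
def data_rate_gbps(width, height, hz, bpp=24):
    return width * height * bpp * hz / 1e9

HDMI_1_4_MAX = 8.16   # Gbps usable data rate (10.2 Gbps raw, 8b/10b coding)
HDMI_2_0_MAX = 14.4   # Gbps usable data rate

for hz in (30, 60):
    need = data_rate_gbps(3840, 2160, hz)
    print(f"4K @ {hz}Hz needs ~{need:.1f} Gbps "
          f"(fits HDMI 1.4: {need <= HDMI_1_4_MAX})")
```

4K30 needs roughly 6 Gbps and fits; 4K60 needs roughly 12 Gbps and requires an HDMI 2.0-class link end to end (GPU port, cable, and display).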
Dominate the race track or battlefield with this GIGABYTE GeForce GT 1030 graphics card. The 2GB of GDDR5 memory and 384 CUDA processing cores let you play titles seamlessly, while the low-profile design fits in a wide array of systems, including ATX cases.
This GIGABYTE GeForce GT 1030 graphics card has DVI-D and HDMI ports, which support multiple UHD monitors for an immersive viewing experience. It is powered by the NVIDIA GeForce GT 1030 graphics processing unit (GPU), with a 1252MHz clock speed and 1506MHz boost clock to help meet the needs of demanding games.
2GB of GDDR5 (64-bit) on-board memory provides the memory needed to recreate game visuals with striking realism. The PCI Express 3.0 interface offers compatibility with a range of systems. DirectX 12 and OpenGL 4.5 support enhances gameplay by delivering ultra-realistic images, detail, texture, and improved lighting effects that create lifelike interactivity. DVI and HDMI outputs enable flexible connectivity.
Do you want a graphics card to game with? You might want to look elsewhere. Do you want to surf the web, handle productivity tasks, or stream cable TV to your monitor? If so, this is your card.
A 300-watt power supply forces you to be a tad stingy with the juice going to the card, and so far I haven't regretted this choice. Anything over what you need is a waste; anything under what you need will disappoint.
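That "not over, not under" sizing logic can be sketched as a simple power budget. The wattages below are typical TDP-class figures assumed for illustration, not measurements from this reviewer's build:

```python
# Hedged sketch: rough PSU headroom check for a low-power GT 1030 build.
# All wattages are assumed typical TDPs, not measured values.
components = {
    "GT 1030 (TDP)":         30,
    "65W-class desktop CPU":  65,
    "motherboard + RAM":      40,
    "drives + fans + USB":    25,
}
peak_draw = sum(components.values())
psu_watts = 300
headroom = psu_watts - peak_draw
print(f"estimated peak draw: {peak_draw} W")
print(f"headroom on a {psu_watts} W PSU: {headroom} W "
      f"({headroom / psu_watts:.0%})")
```

Even assuming every part peaks simultaneously, a 300W unit leaves comfortable margin for a card this small.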
I'm not much for sticking out my chest and crowing about how I have the latest, greatest, most "up-to-datest" graphics card. I want what I need, no more, no less. This card fits the bill nicely. Let others spend tons of money.
Unless you are serious about games, this card is better than the Intel UHD 630 and isn't going to cost you a fortune. Heck, I'm running a 27″ 4K monitor on this card; 1080p would be no problem.
Experience superior graphics performance with the ZOTAC GeForce RTX 3060 Twin Edge OC video card. Built on the NVIDIA Ampere architecture, it features blazing-fast GDDR6 memory and a 1807MHz boost clock, as well as improved ray tracing and tensor cores to enhance your gaming and content creation experience.
It boasts 12GB of video memory that can handle up to 8K resolution with ease.
If you've got a 2060 Super or any 20-series card, don't upgrade; the performance difference isn't much. I ran 3DMark about five times because I was in disbelief that the scores were what they were given how well it games (it games awesome): only a 100-200 point gain.
This card does, however, handle RTX way better and smoother than a 2060 Super, and the cooling is better than on any card I've owned. If you're going from a 1660 or below to a 3060, it's definitely worth it.
But if you still have a 20-series card and are looking to upgrade, look at a 3070 or above, or just don't upgrade until the 40-series makes its debut. You never actually reach the 12GB of VRAM in any game, even with RTX at max, so you've got a lot of headroom; I can also run the card at max graphics settings in every game I've tried so far.
Still gotta test Forza Horizon 5, lol. Overall: good card, awesome performance, awesome cooling. The price is still ehhh, but GPUs are coming down, so maybe wait a bit. Everything above was based on my experience playing on a 1080p, 144Hz monitor.
So if you use a 1440p or 4K monitor, your mileage will definitely vary. It should still work fantastically for any application you throw at it.
For the price, you are getting a 10% upgrade over the two-year-old RTX 2060 and a 71% upgrade over the five-year-old GTX 1060 6GB. The RTX 3060 Ti boasts an impressive 32% gain over the RTX 3060 for usually just a hundred dollars more than this one (at unscalped prices).
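Treating those percentages as multipliers gives a rough relative-performance ladder (these are the figures quoted in this review, not independent benchmarks):

```python
# Relative performance, normalized to the GTX 1060 6GB = 1.00,
# chaining the multipliers quoted in the review.
gtx_1060 = 1.00
rtx_3060 = gtx_1060 * 1.71      # "71% upgrade over the GTX 1060 6GB"
rtx_2060 = rtx_3060 / 1.10      # 3060 is a "10% upgrade over the RTX 2060"
rtx_3060_ti = rtx_3060 * 1.32   # 3060 Ti: "32% gain over the RTX 3060"

print(f"RTX 2060   : {rtx_2060:.2f}x")
print(f"RTX 3060   : {rtx_3060:.2f}x")
print(f"RTX 3060 Ti: {rtx_3060_ti:.2f}x a GTX 1060 6GB")
```

Chained this way, the 3060 Ti lands at roughly 2.26x a GTX 1060 6GB, which is why the review treats it as the better buy when it can be found near MSRP.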
It has ray tracing enabled, is factory overclocked, and also features NVIDIA's Lite Hash Rate (LHR) v2 tech, a hardware-side mining limiter that reduces this card's mining performance by 30% to 50% to reduce its appeal to crypto miners.
For cooling, it has two fairly large fans over a multi-finned heatsink under Zotac's "Twin Edge" branding, a common setup for Zotac which they claim cools better than a regular two-fan design. Under heavy load the "Twin Edge" setup was not very noisy, and the card's temperature never exceeded 70C, which is quite good for a 170W-rated GPU.
The card is equipped with 12GB of GDDR6 video memory, which is a bit of overkill for modern games. However, that amount of memory makes the card more future-proof for apps and games that demand more every year.
The card can last you many years with proper dust removal and maintenance. I would recommend it for 1440p gaming with a 120Hz monitor or less. It is perfect for an intermediate gamer who wants performance on a budget.
However, you may see performance drops when gaming at higher resolutions like 4K, on unusually wide monitors, or at faster refresh rates. In the end, I recommend this card for the budget gaming station.
It has a good price-to-performance ratio and is usually one of the most available models during the shortage. However, if an RTX 3060 Ti is available at a non-scalped price, I would recommend the 3060 Ti for its better price-to-performance ratio and future-proofing over the 3060.
This graphics card is 24cm long and only two PCIe slots wide. Compared to other 5060 options, which usually take ~30cm and 2.75+ slots, this allows installation in a smaller computer case.
For me, it successfully fits into the space of a Lenovo P330. I haven't run benchmarks on it, but it greatly outruns the original P400 that shipped with the P330 workstation.
With an improved high-density heatsink, redesigned high-performance heat pipes, dual BIOS, DrMOS VRMs, and RGB lighting, the PowerColor RED DEVIL Radeon RX 6950 XT, Radeon RX 6750 XT, and Radeon RX 6650 XT graphics cards will once again deliver best-in-class gaming solutions.
I have been waiting years to buy a card, and held on to my dual Titan Xs a bit longer than I wanted. But with the way things are, what choice does anyone have? I've also been an NVIDIA fan since the beginning, having originally run 3dfx back in the day.
With that being said, NVIDIA is a bit off with availability and pricing on cards. I have waited years to buy something from team green and it just hasn't been possible; I refuse to pay double for a card.
I then decided to wait for the 4000 series. At this point (two, almost three years later) I was fully prepared to pay any price to get a 4070 or above on release day. But with the price and performance of this card, I just couldn't wait anymore.
I went out and bought it the same day, and it's worth every single penny. I couldn't be happier. This card performs well and plays everything I throw at it, at times better than a 3090 Ti in some games, which at the price I paid is insane.
One note to all: the minimum power-supply spec for this particular card is 950 watts, and it means it. I had an 850W unit, and even at idle I noticed some weird gremlins. Things cleared up when I flipped the BIOS switch from OC to Silent.
I was then able to put it back on OC once I got a 1250W power supply. For anyone looking for a card today, for the price, this is the way to go.
This card is a workhorse. It handles everything I throw at it and still has room to grow with features that are, to some degree, still in development, like Smart Access Memory (SAM), to fully utilize the card's chipset.
Originally I purchased the RX 6650 XT, and while it was great, I couldn't help but want just a little more, and I got it with this purchase. Super happy with this card, as it doesn't leave much to be desired.
It is everything I saw when I researched it online. It was priced at MSRP and is absolutely the fastest video card I have ever bought. I did my own benchmarks with the original AMD Adrenalin drivers, then updated to the newest drivers, and it is even faster now, just as AMD claimed it would be.
VR Ready: discover next-generation VR performance, the lowest latency, and plug-and-play compatibility with leading headsets, driven by NVIDIA VRWorks technologies. HDMI 2.0b, DisplayPort 1.4, and Dual-Link DVI: the latest standards in DisplayPort and HDMI interfaces.
NVIDIA GeForce Experience: the essential companion to your GeForce graphics card. Capture and share videos, screenshots, and livestreams; keep your drivers up to date and optimize your games. NVIDIA G-SYNC Compatible: get smooth, tear-free gameplay at refresh rates up to 240Hz, plus HDR and more.
This is the ultimate gaming display and the go-to equipment for enthusiast gamers. Microsoft DirectX 12 API, Vulkan API, OpenGL 4.5: power new visual effects and rendering techniques for more lifelike gaming.
NVIDIA Ansel: turn your screenshots into art with this powerful in-game photo mode that captures 360-degree, HDR, and super-resolution photos. Game Ready Drivers: get the highest levels of performance and the smoothest experience possible from the moment you start playing.
Maybe I got a lemon or something, but this GPU generates way too much heat to be cooled by a single fan. I'm seeing this card run at a constant 90C, with occasional spikes above 90C that cause the card to throttle down to get back under 90C.
In Sekiro the throttling introduces ungodly FPS lags/stutters (a death sentence, as that game requires precise timing on your blocks/deflects). In Final Fantasy XIV I had to turn the settings down to Normal to maintain a buttery-smooth 60 FPS at 1080p; this card couldn't even hold 60 FPS at 1080p on High settings.
I've since bought myself a 5500 XT (with dual fans, of course), and not only does that card maintain a cool 70-80C, but I'm able to crank my games up to High/Max settings and get that smooth 60 FPS at 1080p.
Not knocking the 1660 Super overall, just this particular card, which needs more than a single fan to stay cool. I'm kinda miffed I missed my return window, but I guess I'll sell it second-hand and recoup some of my cash.
If you're going to get a 1660 Super, BUY ONE WITH DUAL FANS!
If you are trying to decide whether the RTX series cards are worth it, ask yourself what resolution you will be playing at. Anything higher than 1080p is going to severely hinder performance on this card, even with a good CPU.
Stick to the higher-end RTX cards for 1440p or 4K gaming, especially if you are using a high-refresh-rate monitor. I paired this card with a 32-inch LG G-Sync model, and I wish I had spent the extra $400 on a 2080 for good 1440p FPS.
Otherwise this card is amazing for regular HD gaming, and I would recommend it to anyone with a 1080p 144Hz monitor. Review edit: I've had to RMA this card twice. It worked for a few months both times, but both cards died the same way; they won't POST.
I'm not sure what's going on, but it's very frustrating. If you are new to PC building, just buy a pre-built. Building was fun, but now it's a nightmare.
Replaced a dead 1070 with this card: same performance for half the price. I had a 960 as my backup GPU after the 1070 died, and it would barely run any games on the lowest settings. This 1660 Super is a champ, very fast and stable.
The EVGA software is simple and easy to navigate, and doesn't bog down the computer the way a lot of vendor software does (looking at you, Gigabyte garbage). Highly recommended for 1080p gaming on a budget. One of the other reviewers said this takes two 8-pin power cables, but it does not.
It takes ONE 8-pin power cable. I paired this up with a Ryzen 5 2600X on an MSI B450 Tomahawk MAX with 16GB of 3200MHz G.Skill memory, powered by a 10-year-old Antec 650W PSU. I play World of Warcraft and PUBG, and I stay well over 100 FPS in both games on high/ultra settings at 1080p.
Stepping up to GDDR6 memory is like going from an HDD to an SSD; you can just feel the snappiness and speed compared to the 10-series cards.
The EVGA GeForce RTX 3080 delivers the unprecedented performance that gamers crave for 4K resolution gaming, powered by the NVIDIA Ampere architecture. It’s built with enhanced RT Cores and Tensor Cores, new streaming multiprocessors, and superfast G6X memory for an amazing gaming experience.
Combined with the next generation of design, cooling, and overclocking with EVGA Precision X1, the EVGA GeForce RTX 3080 Series presents a new definition of ultimate performance. Features: NVIDIA GeForce RTX 3080 LHR, powered by the NVIDIA GeForce RTX 3080 graphics processing unit (GPU) with a 1755 MHz boost clock to help meet the needs of demanding games.
On-board memory: 10GB GDDR6X (320-bit). PCI Express 4.0 offers compatibility with a range of systems. NVIDIA technologies harness the power of the graphics processing unit (GPU) to optimize computing performance.
TL;DR: Incredibly pleased with the boost in performance (especially when using Blender 3), while at times requiring less power than the old card! Longer thoughts: I retired a 2015 EVGA GTX 980 Ti SC+ ACX 2.0+ (part no. 06G-P4-4995-KR), which had been used quite a lot for rendering with Blender 3 since its release last year. Navigating viewports in render preview mode is much more fluid for detailed geometry/materials, and final image rendering performance is considerably better! The extra VRAM also allows higher-resolution renders without Blender running out of memory when denoising via GPU (OptiX).
GPU/OptiX denoising is an order of magnitude faster than CPU/OpenImageDenoise denoising. I only tested two Blender benchmark scenes with the 980Ti card, so while they may not give an indicator across the board of how much more powerful the 3080Ti is, their results were interesting.
Using the 980 Ti (GPU, CUDA API):
bmw27_gpu: 1min 7sec, 194.3 W max
classroom: 1min 55sec, 216.2 W max
Using the 3080 Ti (GPU, OptiX API):
bmw27_gpu: 9sec, 304.8 W max
classroom: 19sec, 332.7 W max
It definitely uses more power, but over less time.
Wattage was logged using the GPU-Z tool. Where it was most notable was with my own archviz scene, which for a single frame took the 980Ti approx 15mins to render. The 3080Ti renders the exact same image in 3min 32sec.
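Taking the benchmark figures above at face value, total energy per render actually favors the newer card, since the higher wattage is drawn for far less time. A rough comparison, using peak wattage as an upper bound (so absolute numbers are pessimistic, but the comparison holds):

```python
# Energy per render, using the review's peak-wattage and time figures.
# peak W * seconds overestimates true energy, but works for comparison.
scenes = {
    #             (980Ti sec, W peak), (3080Ti sec, W peak)
    "bmw27":     ((67, 194.3), (9, 304.8)),
    "classroom": ((115, 216.2), (19, 332.7)),
}
for name, ((s_old, w_old), (s_new, w_new)) in scenes.items():
    e_old = s_old * w_old / 3600  # watt-hours
    e_new = s_new * w_new / 3600
    print(f"{name}: 980Ti ~{e_old:.1f} Wh, 3080Ti ~{e_new:.1f} Wh, "
          f"speedup {s_old / s_new:.1f}x")
```

By this estimate the 3080 Ti finishes each scene 6-7x faster while using roughly a quarter of the energy per render.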
Gaming: most of my games feel/look the same (!) as on the 980 Ti. This is probably because they're mainly old (2007-2018) titles, my system runs at 1920×1200 (no 4K or anything fancy), and the 980 Ti could already handle them without problems.
However, I did notice that Far Cry 5, Borderlands 3, and Shadow of the Tomb Raider can be cranked up to their highest settings on the 3080 Ti without any issues, although I suspect such settings are overkill at 1920×1200.
Some visual differences are subtle. ¯\_(ツ)_/¯ For the above games (and Max Payne 3 and Detroit: Become Human) I discovered that the GPU isn't being stressed according to GPU-Z: it is literally running in an "Idle" state according to the "PerfCap Reason" metric, all clock speeds are quite low, and "Board Power Draw" averaged around 110 watts.
While I never captured these same statistics on the 980 Ti, I recall that card never being "Idle" during gaming (i.e. it was always busy and limited by either power or voltage, etc.), with clocks constantly maxed out and power draw very close to its 250W maximum.
Similarly, Ark: Survival Evolved can have its visuals turned up and feels much more fluid than it ever did on the 980 Ti, even at lower settings. The game still makes the PC sound like a jet aircraft at times, though! Ark does make the 3080 Ti work hard, consuming 250+ watts, similar to the 980 Ti, except the experience is considerably better.
The PC is a 2010/2011 machine using a Gigabyte X58A-UD5 motherboard, an Intel i7 980X 6-core CPU at 3.46GHz, and 24GB RAM, running Windows 10 Pro build 19044.2006 with NVIDIA 512.15 drivers. Old hardware by today's standards, but it works brilliantly with the 3080 Ti.
There was no need to change the PSU: I simply unplugged the 6-pin and 8-pin PCIe power cables from the 980 Ti, swapped the video card, and plugged the 8-pin and 6+2-pin power cables into the 3080 Ti. Job done. The power supply is a 1020W Enermax.
The use of two 8-pin power cables was a key reason I chose this particular card: some other 3080 Tis (and the 3090s) appear to require three 8-pin connectors, which could have been an issue for my PSU.
Although the 3090 series has an extra 12GB of VRAM, its additional power requirements and only "slight" performance increase over the 3080 Ti didn't seem worth it for my use case, specifically Blender. Moving from a 7-year-old card to this was already a huge jump.
I’m very happy and would recommend this card if you’re making a similar upgrade.
I’ve been playing PC games for half my life at this point and to see where GPUs have come in 15 years is insane. Pretty sure my first real GPU was a GTX260 in 2008, then a 6850 in 2010, GTX 970 in 2017, and a 2080 Super I got really lucky with in 2022.
If it wasn’t for a super friendly and extremely helpful Microcenter employee at the Tustin store, I would still have a 2080 Super in my rig (not that it isn’t an absolute beast of a card, but now my girlfriend and I can game at 4K together!).
The EVGA XC3 Ultra is an amazing card. I’ve put tons of time into research to find the best balance of performance, cooling, and reasonable power consumption and the XC3 fits the bill for all 3. From day one of install to a couple weeks later it’s running amazing.
Whether it’s a 3D render in SolidWorks to just gaming with the boys, it handles it all flawlessly. Overclocking with the card is also pretty painless, although I haven’t noticed much of a difference in real life application besides benchmark numbers being slightly higher with higher power targets and higher memory frequencies.
EVGA really bins the cards well and doesn’t leave much performance off the table. I definitely understand that there is a huge diminishing return on a 3080ti compared to a 3090, but for my workloads I don’t need 24GB of VRAM, maybe in the future, but I’m not doing hardcore model renders at home and the time savings isn’t enough for me to justify another $400 on MSRP.
Although I bought open box and Microcenter guaranteed a good card with no issues and they delivered. The 20% off the sale price was also amazing. So if you have the chance to catch one at a 3080 original MSRP, I would definitely go for it.
While the RTX 3080 Ti XC3 stands in contrast to the FTW3 in EVGA's line, for me it had many elements that made it very attractive to my build. It has two 8-pin power connectors versus the three seen on other 3080 Ti cards.
This made it an easy drop-in replacement for my other cards. Additionally, since it is so thin comparatively, it takes up less board space and fewer outlets. After swapping from my old card, things are actually MORE accessible.
While I rarely need to get into the system to fiddle with things, a thinner card with simpler power requirements made for an easy swap. Performance-wise, it's fantastic. It lives up to the card's title: the 3080 Ti excels at pumping out graphics when asked, and I've rarely met a modern game it can't run at 4K 60Hz at a minimum.
I have not noticed any problems with heat. With 3 fans the card keeps itself quite cool under load and I’ve measured a peak temperature around 70 degrees for my build. A bonus is that card came with its own bracket to help hold it.
While I already had one, it is nice to see that manufacturers are acknowledging card sag and provide solutions in the box. My biggest misgiving is the RGB. There are two RGB elements to the card which are EVGA markers.
They are easily configurable and you can turn them off but the styling is still noticeable even with the RGB off. Overall, very satisfied. Amazing form factor that made a hot swap simple without much thought on power or space.
And it delivers the base 3080 Ti performance I wanted out of the card.
The EVGA GeForce RTX 3080 Ti FTW3 Ultra 12G graphics card delivers incredible performance for both gamers and content creators. It features NVIDIA's Ampere architecture, providing DLSS AI acceleration and, of course, industry-leading ray tracing capabilities.
The EVGA RTX 3080 Ti FTW3 Ultra 12G features 10240 CUDA cores, a 1800 MHz boost clock, and 12288 MB of 384-bit GDDR6X. While you're pushing your gaming to the limit, EVGA's iCX3 cooling technology ensures your GPU stays cool.
The EVGA GeForce RTX 3080 Ti FTW3 Ultra 12G was released on June 3, 2021 at a retail price of $1,400.
I am astonished at what this card is capable of. Even my high expectations of what a premium-priced 30-series RTX GPU could deliver were exceeded. EVGA delivers not only some of the best enthusiast features, but does so while seriously undercutting its competition.
When I walked in to buy my card, it was $500 less than the 3080 Ti sitting next to it. This is not at the expense of any quality or feature, but simply an economic effect. For once, the market leans in our favor.
The overclocking on this card essentially nullifies any advantages that the 3090 FE has over the 3080ti FE. The cooler is about as good as it gets for the kind of power/voltage that EVGA is putting through this card.
I wish EVGA were more aggressive with their default fan curves, but these are simple to customize. I saw moderate reductions in temps by increasing the fan speed in the 70-80C range, which is where this card operates during gaming.
I personally believe that this card is even overkill for 1440p. I decided to pair it with a 4k 144hz M28U, and I’m glad I did. I would feel bottlenecked with a 1440p monitor. Not only because this card is an absolute beast, but because DLSS will make up for any discrepancy between your FPS and screen refresh rate, if there is one.
As for really intensive titles, I’d especially argue for 4k. It’s truly a sight to behold when this card makes quick work of games you have only dreamed of. For a single component in your build, the price is hard to swallow.
But I will say this: in this GPU market, this is AS GOOD AS IT GETS. Brand new, from a revered retailer, who is pricing it with little to no inflation attached. Like I said, competing cards from other AIBs are inflated by $500 on the same shelf.
I’m expecting this card to take me through this decade, so all things considered, it’s a pretty good deal.
The 3080 Ti is Gen 4 pinnacle performance. You literally can't do better with consumer graphics cards without going to (i.e. waiting for) Gen 5 (PCIe 5.0). I bought this card for gaming, and in the most intense and demanding game I play (Cyberpunk 2077), I get a consistent 63 FPS at 1440p with all settings maxed out completely.
An impressive feat considering my last card had an FPS in the 20’s under the same settings/conditions. Now I can finally play it. Not every game though is as demanding as Cyberpunk 2077. Everything else I game with this card pushes close to 200 FPS.
Fantastic performance. Good thing I paired this card with a 240Hz monitor. If you want the very best Generation 4 has to offer, this card is your ticket to updating to the highest current standards. There is no such thing as “future proofing” so don’t buy this card for that.
Nothing is future-proof. The graphics card is the one and only computer part I'd recommend over-spending on, and if you can find a way to afford it, buy a 3080 Ti. And heck, at the time of writing this review, Microcenter is selling it for UNDER MSRP.
The GeForce RTX 3070 Ti is powered by Ampere— NVIDIA’s 2nd gen RTX architecture. Built with enhanced RT Cores and Tensor Cores, new streaming multiprocessors, and superfast G6X memory, it gives you the power you need to rip through the most demanding games.
2 PCI slots, 2 x 8 pin PCI Express power connector
Dual 8-pin to 12-pin power cable
NVIDIA GeForce Experience
HDCP 2.3, DisplayPort 1.4a
The NVIDIA GeForce 3070 Ti Founders Edition is a solid and robust GPU that seems to perform well in a variety of games. After an initial hiccup caused by an underpowered power supply, I upgraded to an 850-watt unit and have had zero issues since.
The unit itself is long and surprisingly heavy; that said, I have not noticed any significant sag, but a support brace would probably be a good investment. The card has a neat aesthetic that you will either like or not, but compared to partner units it is definitely unique.
The power consumption under load might be the biggest drawback of this card, and presumably of the 3080- and 3090-series cards. However, once an adequate power supply is in place, you can expect upper-moderate-level performance.
The 3080 and 3090 cards are definitely more powerful, but at what cost? Granted, the rumored 40-series cards are even hungrier in terms of power. In my setup, running under load, I saw temps around 68 to 85C.
Now, I run five intake fans and I also have an AIO cooler for my CPU, which is a Ryzen 7 3800X on an MSI B450 motherboard with 32GB of Corsair RGB RAM, so your mileage on GPU temps may vary based on your setup.
I have not run benchmarks to get FPS numbers, but anecdotally I can say it is a significant upgrade from my old 1660 Super. I run at higher graphics settings with better performance and no notable tearing or graphical artifacts.
Games played include Call of Duty Black Ops Cold War, Madden 22, Tomb Raider, Mortal Kombat amongst others. Overall I am pleased with the performance and have my system at a place where I feel it matches or exceeds current gen consoles.
Because of the noticeable improvements I have opted to give the Nvidia 3070 TI founders edition five stars.
This card is honestly the best bang for your buck, especially at this price point! The review starts under this paragraph, but I know these are hard to come by; patience is key with these items, so please wait until there is more consistent stock and pay MSRP rather than paying scalpers.
At this price point, the performance is so phenomenal that it's hard to fathom how much value I got out of this card. I switched from a 2070 Super Founders Edition to this 3070 Ti Founders Edition, and the difference is noticeable.
While the 2070 Super is a great card and ran everything, the 3070 Ti runs every game smoother than a hot knife cutting through butter. I'm running pretty much every game at max settings at 1440p without dropping under 100 FPS; COD Black Ops Cold War is one of these.
My temperatures for the card are pretty good too; I haven't seen it go past 75 degrees Celsius under load. I also use it for editing and rendering videos, and it works just as well. The design of this card is aesthetically pleasing, with the two-tone color scheme.
This card took a design similar to the 3080's (most noticeably in the fan placement) while mixing in the 3070's design. The GeForce RTX logos do not light up like on the 3080 and above models, FYI for those who are into RGB lighting.
The only downside I can find is that it uses a 12-pin GPU connector, so you'll need the adapter it comes with, which may make your case look a little messier; you can always purchase a third-party 12-pin cable to clean it up a bit.
However, those last two points didn't affect my score in this review. I luckily went to an in-person event and only waited 10 hours to get handed a ticket the day before, so I didn't have to camp overnight.
I just got there at 4 AM the day before. If you have that opportunity, I'd say that's the second-best way to obtain these cards, other than an online drop, which can be a little more difficult. As with anything, just be careful going this route, and I recommend either befriending someone in line or bringing a friend with you to pass the time and hold your spot when you need to step out for a bit.
Overall, this card is amazing for the price, and if you can get your hands on one in the future, you'll definitely get your money's worth. Just please be patient, as these are slowly coming back into stock at Best Buy.
It was nice to have the opportunity to purchase a video card at actual MSRP. Of course I had to camp out the night before release to actually get one. So was nine hours of extra sleep deprivation worth it? I appreciate the overall performance of this RTX 3070 Ti FE.
I also appreciate that the price of these founders edition cards is not being jacked up over MSRP like all the AIB partner cards. This card (RTX 3070 Ti Founders Edition) uses an included 12 pin to dual 8 pin adaptor.
It’s rather short so I’ve invested in a custom replacement (strictly for aesthetic purposes). It’s certainly a plus that they included an adaptor for the currently non-standard 12 pin power connection.
Note that Nvidia recommends at least a 750w PSU for this card and that you use two separate 8 pin cables from the PSU rather than a single to dual 8pin cable from the PSU to power it. Installation and setup is otherwise straightforward and just like any other graphics card.
Download and install the latest drivers and commence enjoying the high frame rates at high or ultra settings and ray tracing at 1440p 144Hz like me. I don’t, however, believe the slight performance increase over the 3070 justifies the massive increase in power consumption or the $100 price difference.
The major con is that this card is power hungry, to say the least. In my testing, under heavy loads at or near 100% utilization, the power draw jumps up to, and sometimes over, 300 watts. Don’t get my criticisms wrong: I immensely appreciate being able to get a video card at all, and at MSRP, with all that’s going on with the “GPU shortage” and other logistical issues at this time.
If I had the choice I would have preferred the RTX 3080 FE instead; I feel it’s the better value-to-performance deal. But I consider myself lucky to have gotten any GPU. Finally, I have to say I appreciate the way Best Buy handled the drop; allowing only one card per person helps ensure that at least some of these cards get into the hands of gamers and video content creators, not just e-coin miners.
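The 750W recommendation above makes more sense once you budget for the whole system, not just the GPU. A minimal sketch of that arithmetic: the ~300 W GPU peak is the figure from this review, but every other wattage below is an illustrative assumption, not a measured value.

```python
# Rough PSU headroom estimate for an RTX 3070 Ti build.
# Only the ~300 W GPU peak comes from the review; the other
# component wattages are assumed, illustrative figures.

def psu_headroom(psu_watts, loads):
    """Return remaining headroom after summing estimated peak loads."""
    total = sum(loads.values())
    return psu_watts - total

loads = {
    "gpu_peak": 300,                # observed in the review at ~100% load
    "cpu_peak": 200,                # assumed high-end desktop CPU under load
    "board_ram_storage_fans": 75,   # assumed
    "transient_margin": 90,         # assumed allowance for GPU power spikes
}

print(psu_headroom(750, loads))  # 85 W left on a 750 W unit
```

With those assumed numbers, a 650 W unit would be left with almost no margin, which is roughly why the vendor guidance lands at 750 W.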
Powerful GeForce RTX 30 VISION series accelerates your work with incredible boosts in performance. Whether you are doing video editing, 3D animation, photography, graphic design, architectural visualization or broadcasting, it can save you a lot of time.
I’d been running an RX 590 for about 3 years. I upgraded my monitor to a 27″ 144 Hz monitor with a 1440p native resolution; before that I had a normal 24″ 1080p monitor at 60 Hz. The new monitor looked great, but the RX 590 struggled, ran very loud to keep cool (which was annoying), and wouldn’t maintain 144 fps without greatly lowered settings.
This 3070 Ti is awesome. Disregard those on YouTube who might insult this card. It’s large, the heatsink is excellent, and the three fans are just right to keep it cool. In gameplay my CPU is at 25-30% utilization and the GPU is at 40-60%.
Temps run 45-60°C. Wattage usually sits just below 200. In Time Spy the wattage can hit 305 very briefly, but is usually about 200. You must use two 8-pin connectors from two separate rails on your PSU.
The 6-pin-to-8-pin types work fine too, but a single power cable to the PSU won’t work. In short, I love it and was happy to be able to get one.
It kind of goes without saying that a 30 series card (NOT the 3050) gets 5 eggs. I’m ecstatic to have it in my build. Finally my 11900K is no longer held back by my old 1650. I’ve said it all in the pros: this card will blow away any game you try to play on it.
I’ve not thoroughly tested games like Elden Ring, Cyberpunk, or New World (which might be the only concern, though that happens even with 3090s), but I’ve seen FPS of over 200 on ultra settings for basically any game that isn’t capped.
170 is the lowest I’ve seen. And all the while capping out at 70°C in the nOtOrIouSlY HoT NZXT H510 Elite, with no “jakeface” custom front panel. I’d recommend it even with the looming release of the 40 series. Maybe just wait for the price to drop a tad more.
I bought this to install in a “white RGB” gaming PC. Installation was easy thanks to the Cooler Master mid-size case. I did use (and recommend) a brace because of the weight of the card, but I anticipated that and had already ordered one.
So far, no issues with video. I am using one of the HDMI outputs connected to an LG C1 48″ OLED TV. Looks great! Most gaming is via the Xbox app and I am using the Xbox Game Bar to record my game sessions.
Right now 1080p 60 FPS is more than sufficient for campaign videos and the output is very nice. I haven’t tried to overclock yet and don’t know if I will. Update 4/1/22 (not fooling): occasionally the video will just cut out.
No output from the card. A reboot gets things back. When this happens I immediately check temperatures, and everything is well within tolerance. No idea why it happens. A minor irritant, unless I am in the middle of a LASO run.
Could it be the power connection from a single PCIe (VGA) power cable split into two connectors?
Gigabyte does recommend a 650-watt power supply, but I believe that’s only for CPUs that use a lot of wattage. I am on an AMD Ryzen 7 3700X. The 600-watt EVGA power supply I own has only one PCIe cord, with the 8-pin and 6-pin connectors on the same cable.
There isn’t a separate cable for each connector, so they are daisy-chained. I’ve been told by others it’s best to have them separated, but my EVGA power supply doesn’t offer that. I don’t think that’s what’s causing the weird lag spikes in games, though.
Team Red has advantages over Team Green, and this is quite unexpected. I was expecting NVIDIA to be much better, since their graphics cards carry a premium price tag. Yes, they currently benefit from better DLSS and ray tracing abilities, but who knows how long that will last, since AMD seems to be catching up.
Also, I’ve always considered myself more of an NVIDIA fanboy, so I am a bit shocked right now. I wasn’t expecting to like some of the features AMD has over NVIDIA. I also wasn’t expecting an older, weaker GPU like the RX 5500 XT 4GB to perform a little better at certain things and not have the weird lag spikes I’ve experienced so far with my Gigabyte RTX 3070 8GB.
I haven’t had an NVIDIA graphics card since 2018 or so, when I ended up selling my Gigabyte GTX 1070 8GB Windforce on Craigslist. The biggest difference I’ve noticed so far is that AMD has noticeably better image sharpening.
Not just that, but the AMD graphics control panel is much easier on the eyes, while NVIDIA is stuck in the past; their control panel looks old and plain.
Some people might prefer that, but I personally think AMD has the better graphics control panel. I also like how AMD tells me my average FPS in games without me having to put an FPS overlay on screen.
The AMD control panel shows the average for each game you’re playing and how long you’ve been playing it. So far I have not seen this feature in NVIDIA’s control panel or GeForce Experience.
Maybe I’m not looking in the right places? Also, I don’t know if my Gigabyte RTX 3070 8GB is defective or not (if it is, it will eventually go back to either Newegg or Gigabyte), but I have been experiencing weird lag at times, and a couple of times the computer restarted all by itself.
I do not know what the heck that means. I don’t remember any of these weird issues with my GTX 1070 8GB, but I wasn’t gaming at 4K on the GTX 1070; I was only at 1080p resolution. I was expecting NVIDIA to blow away AMD, but so far I have been very unimpressed for the $629 spent on this GPU. Then again, maybe there’s something wrong with this particular card and it’s defective.
The lag also makes it feel very underpowered. I was definitely expecting much better performance, and I am on a Ryzen 7 3700X with 16 GB of RAM and a 1 TB SSD. All in all, so far I have not been satisfied with my upgrade.
On my old RX 5500 XT 4GB, which isn’t even a 4K card, GTA V looked better at times at 4K thanks to AMD’s image sharpening. My major complaint about the RX 5500 XT 4GB is that it had graphics pop-in, and some of the graphics definitely look a bit better on NVIDIA.
I believe if I set grass or other settings to ultra in GTA V, it wouldn’t look as detailed as it does on NVIDIA, but that could be because the RX 5500 XT 4GB is not a 4K card at all; on a faster AMD card such as an RX 6700 XT or RX 6800 XT it could look better.
I am quite shocked that AMD is better than NVIDIA at certain things. The most obvious is the graphics control panel; it is much better on AMD, in my personal opinion. I also feel that optimizing graphics through the AMD control panel is much easier than on NVIDIA.
They even explain, when you hover your cursor over the ? in the AMD graphics control panel, what each setting does and how it affects your games. I’ve owned a long list of graphics cards throughout the years, and I’ve had a lot of experience with Gigabyte cards.
I keep coming back to Gigabyte because they usually offer the best prices, and I’ve also had a great experience with their customer support. My old GTX 470 died in the past and they replaced it with a used or open-box GTX 570.
I don’t know if it was used or not, but it worked great, and it was nice that they replaced it. I’ve had issues with Sapphire and PNY where they wouldn’t honor their warranty; Gigabyte, on the other hand, seems to honor theirs.
I sent back two cards: one being the GTX 470 that died on me, the other a GTX 1070 that ended up having no issues whatsoever. I believe Gigabyte found out that it was my system that was acting up.
I forget what it was (it could have been a dying hard drive or something else), but it wasn’t the GTX 1070. The Gigabyte GTX 1070 was probably my favorite Gigabyte card until I sold it. It ran like a champ at 1080p.
So there I was, looking through the Scan website (as I have done for the last 6 months), refreshing the page every couple of hours in the hope of a drop, and BAM: a list of 3070s appeared with prices and the Add to Basket option.
Panic set in, as I’ve been here multiple times since release, and I quickly scanned the list for the card I wanted. Sadly it wasn’t available; however, this one was. Trust me, if you’re reading this, I feel your pain.
This card fits nicely in my “There’s a shortage so this had to do” build. Paired with a R5 3600, 32GB Corsair RAM, Seasonic Focus 80+ 650w Gold, ASUS B550-F Gaming, all packaged in a Corsair 275R case.
The card itself: it looks beautiful in its white finish, and it’s definitely a two-power-connector job (just to be safe). No bent power pins here, but then this is a 2021 card assembled in week 8 and, from what I can see, it has the revised power connectors (do inspect this upon opening).
It is a little noisy when the fans are in full swing, but I’ve never seen it above 65°C, even in FS2020, and you can obviously set a fan profile in MSI Afterburner. Speaking of Flight Sim, it can easily maintain 30 fps in 4K with all the bells and whistles enabled.
If you are using an older screen (like me), I would suggest enabling 1/2 refresh rate via the NVIDIA control panel and locking to 30 fps in RivaTuner for buttery smoothness; otherwise it should run fine with either G-Sync (spits) or FreeSync.
Given that the recent update from Asobo has caused some problems, expect anywhere between 45 and 60 fps depending on location and aircraft. Also tested in Control with RTX on: 58-60 fps and it looks incredible (even though I have no idea what the game is about).
Jedi: Fallen Order runs a solid 60 fps at 1080p on ultra settings. It is a card just begging for 1440p, with some ‘2080 Ti’-style 4K thrown in. Is it bottlenecked by the R5 3600? The CPU has certainly awoken since installing this card; previously I had a GTX 970.
Usage is anywhere between 30 and 45%, with temps up to 70°C. I eventually plan on replacing the CPU with either a 5600X or 5800X, but I’m happy for now and you should be too! Some things to be aware of: if your OS is under MBR, you might hear a few warning beeps on initial bootup; this is well documented across the web for all 3000-series cards.
Mine actually is MBR, due to an emergency drive clone I did a year ago, and I heard no such beeps, so YMMV. The card has silent cooling, so the fans only come on when the card reaches the low 60s (again, set a fan profile if you so desire and all is well).
The price is understandable given the sheer lack of 30-series cards. I definitely overpaid, but I wanted the card; compared to other sites, this was actually much cheaper anyway. My advice: if you see this card available on here (I will pray for you), don’t hesitate, buy it.
Aesthetically pleasing; it goes great with my NZXT S340 Elite white build, and it’s a solid performance jump from my 4-year-old GTX 1080. I really wanted a 3080 but, like everyone else, have been trying to snag one since launch to no avail.
After battling the scalper bots for months, I got lucky and snagged the one card that looked best in my build. It was dumb luck, because at that point I would have settled for whatever I could get. I went from 90-100 FPS on high settings in COD MW (specs below) to maxing out at 144 (my monitor’s refresh max) with this card.
I don’t think I could upgrade to an ultrawide 1440p monitor and do the same with this card, but it will do until I can hopefully snag something more robust. MSFS 2020 went from 20-25 FPS with all settings on high to averaging about 40 FPS.
I really bought this card to improve my MSFS 2020 performance, but my CPU needs upgrading too at this point, with only 4 cores. The whole scalper bot thing has been awful; I’ve never seen anything like it, and I’ve been building PCs since the early ’90s.
Newegg has done a better job than other retailers like Best Buy in mitigating this, but these cards are still really hot. I consider myself very fortunate that I got one and will keep buying from Newegg in the future.
To summarize: if you can’t snag a 3080 or don’t want to spend $$$ on a 3090, this card is currently your best bet, unless you switch teams to red. I’ll take the NVIDIA drivers over AMD’s any day of the week, and I’ve had plenty of both over the years.
Specs: i7 7700K @ 4.8 GHz (delidded and repasted with Thermal Grizzly liquid metal), MSI Z270 Gaming Pro Carbon, Noctua NH-D15 CPU cooler, 32 GB Corsair Dominator DDR4-3000, Gigabyte Vision RTX 3070 OC, EVGA 750 Watt G3 PSU, NZXT S340 Elite white mid-tower, 1 TB Samsung 970 Evo NVMe SSD, 2 x Samsung 850 Evo 500 GB, 1 x WD Caviar Black 7200 RPM 2 TB, Windows 10.
The EVGA GeForce RTX 3080 delivers the unprecedented performance that gamers crave for 4K resolution gaming, powered by the NVIDIA Ampere architecture. It’s built with enhanced RT Cores and Tensor Cores, new streaming multiprocessors, and superfast G6X memory for an amazing gaming experience.
Combined with the next generation of design, cooling, and overclocking with EVGA Precision X1, the EVGA GeForce RTX 3080 Series presents a new definition of ultimate performance.
NVIDIA GPU Boost, NVIDIA G-Sync ready, 2.75-slot Fan Cooler, Nvidia DLSS, EVGA iCX3 Technology, adjustable ARGB LED, 2nd Gen Ray Tracing, 3rd gen Tensor Cores, 5th Gen NVIDIA Decoder, 7th Gen NVIDIA Encoder, HDCP
Effective Clock Speed
Supported OS: Microsoft Windows 7, Windows 10, Windows 11
Required Power Supply
Power Connectors: Three 8-pin power connectors
Power Consumption (Operational)
Software: NVIDIA GeForce Experience, EVGA Precision X1
HDCP 2.3, DisplayPort 1.4a
Service & Support: Limited warranty – 3 years
Reviews From Real Customers
Being a pilot myself, and an IT specialist, I’ve always been a MSFS nerd/enthusiast ever since Flight Simulator for Windows 95! I’ve always built my rigs around the current MSFS platform so that I could run MSFS with all of the graphics settings cranked up to max for best possible immersion/performance I could get from the Sim.
This card is a reflection of that in my new Microsoft Flight Simulator 2020 PC build! My build started with a Core i9-10850 (on an ASUS ROG motherboard), 64GB of Corsair RAM, and an EVGA 850W PSU, but I was still using my old GTX 1070 graphics card.
I wasn’t paying the ridiculous prices for the current cards. Well, finally, with prices coming back down, I decided to jump! I had two cards in my Micro Center wishlist that I was watching: the RTX 3080 (12GB) and the RTX 3080 Ti.
When I got to my local Micro Center, I briefly quizzed the sales rep on the two cards I was debating. He quickly offered that, unless I was planning on running MSFS in VR (specifically at 4K), the 3080 Ti would be an unnecessary additional expense.
So, I went with this EVGA RTX 3080 FTW3 Ultra (12GB), as I haven’t yet succumbed to VR completely, and additionally saved myself a decent chunk of change. 😛 I am absolutely not regretting that decision! I am running all graphics settings in MSFS 2020 at Ultra and this card is EASILY keeping up with the frame limiter locked at 30 FPS! With it off I was getting 60-70 FPS! (The human eye can only distinguish up to 30 FPS anyway, so I leave it locked at 30 to convert this card’s remaining processing power into smoothness in the Sim.)
The 12GB of VRAM on this card doesn’t hurt either, as MSFS is VRAM hungry! In conclusion, I’m quite happy with this card; it is a BEAST though. You’ll probably want to check the specs of your case before you go with it.
I have a Corsair iCUE 4000X RGB case and it fits easily. If you’re an MSFS enthusiast and aren’t set on running strictly in VR (at 4K), this card will not disappoint you at all! I give it my MSFS approval.
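The frame-limiter trade-off this reviewer describes comes down to frame budget: a quick sketch of the arithmetic (the 30 and 60 FPS values are from the review):

```python
def frame_time_ms(fps):
    """Frame budget in milliseconds for a given FPS cap."""
    return 1000.0 / fps

# A 30 FPS limiter gives the GPU a ~33 ms budget per frame,
# versus ~17 ms at 60 FPS; that spare headroom is what the
# reviewer describes as extra smoothness in the sim.
print(round(frame_time_ms(30), 1))  # 33.3
print(round(frame_time_ms(60), 1))  # 16.7
```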
This is a monster of a card; coming from someone who upgraded from a 1070, this card is huge. It is built well, and I like how EVGA always makes and presents their products as premium and well built.
I would recommend, though, that you have either good airflow into your case or water cooling for this card, as she can run hot. I have mine in a Phanteks 500A ARGB, and the stock fans have no problem providing enough airflow to the GPU’s fans to keep this card under 60°C under load.
It is a heavy card, so having a GPU bracket or something to support it would be a plus. It is also quite wide. I had a Lian Li O11 (standard), and the cables that came with my new power supply were rigid, which made it impossible to install this card with the glass panel on.
If your power supply has very flexible cables, or you are using custom sleeves, then it should not be a problem; it was just one thing I found. Others have said they did not have any problems like I did, so maybe my motherboard sits a bit further out than theirs with the support bolts.
All in all, this is a great card. Mine did not come with the plastic red border at the end of the card that appeared in YouTube reviews; mine is just black. The RGB on this card is solid. I am not big on RGB, but it makes the card stand out nicely.
Overall, a solid card. If ray tracing, or 4K with ray tracing, is your meta, then go NVIDIA this generation. However, at this point, FOR ME (owning both the 3080 and 6950 XT), I would recommend the 6900 XT or 6950 XT over anything from the 3080 through the 3090 Ti,
UNLESS most of the games you play utilize ray tracing. Doubly so if you play a lot of ray-traced games at 4K, where these RTX cards really shine. I’m sure the fanboys will attack me for this, but right now
the 6900 XT and 6950 XT just make more sense. My 6950 XT gives me 97% of 3090 Ti performance for nearly $1000 less. The value-per-dollar factor here is a big one for me. Thus, while this is a great card, especially for 4K and RT performance, I would not recommend it over the AMD alternatives at the moment.
That said, I am enjoying this card and seeking out more games with ray tracing to try to take full advantage of its strengths over my 6950 XT. I really love the way it looks, and I WISH EVGA made cards for AMD.
I have absolutely adored EVGA since the 8800 GTX days; I still have that card and it still runs. Will you regret buying it? Nope. But just be aware that there are, IMO, better alternatives out there for the money.
The EVGA GeForce RTX 3060 12GB provides players with the ability to vanquish 1080p and 1440p gaming, while providing a quality NVIDIA RTX experience and a myriad of productivity benefits. The card is powered by NVIDIA Ampere architecture, which doubles down on ray tracing and AI performance with enhanced RT cores, Tensor Cores, and new streaming multiprocessors.
With 12GB of GDDR6 memory, high-end performance does not have to be sacrificed to find a card for gaming and everyday use.
Dual Fan Design, NVIDIA GPU Boost, NVIDIA G-Sync ready, 2-slot Fan Cooler, Nvidia DLSS, HDCP
Effective Clock Speed
Supported OS: Microsoft Windows 7 / 10 (64-bit)
Required Power Supply
Power Connector: 8-pin PCI Express power connector
Power Consumption (Operational)
Software: NVIDIA GeForce Experience
HDCP 2.3, DisplayPort 1.4a
Service & Support: Limited warranty – 3 years
Reviews From Real Customers
I upgraded from an RX 570 4GB to this RTX 3060 and am completely satisfied. I was a bit worried when receiving the package because I heard a rattling noise; it turned out to be just the documentation.
The graphics card came very well protected. I don’t plan to run anything above 1080p or 1440p, so this card suffices. If you need higher resolutions, I suggest you get a 3060 Ti or higher. The 3-year warranty is nice, though I could have sworn Best Buy listed a 5-year warranty on the product page.
That was one of the reasons I went for this card in particular. My only nitpick is that Best Buy is not selling the card at MSRP (I paid about $100 more). I know the market is crazy right now and I’m lucky to have snatched one up in the first place, but I’m buying from Best Buy!! Not some random person or a shady online store.
Best Buy should sell at MSRP. Overall, it’s a nice entry-level RTX card: it runs cool, doesn’t make a lot of noise, has no RGB, and is a 2-fan card with a metal backplate.
I’ve been buying EVGA graphics cards for over a decade and I have never had a problem. They also have the best customer service in the business, and I was really happy to find a card at MSRP. I’ve been waiting for an upgrade for over two years and I’m really happy to be part of the ray tracing club.
This card is a really good value for the price. I am especially happy with the 12 GB of GDDR6 VRAM. That adds a lot of value to this card: admittedly it is not as powerful as the 3070 or 3080, but it has more VRAM than most of the higher-end models,
which will help it stay in the game longer. At a price point under $400, this card will last you many years of quality gaming at 2K resolution or below. And if you find yourself craving more power, well, it won’t sting as badly when you go for the upgrade.
I think I will be happy with this card long after the chip shortage is over and the market has sorted itself out a bit.
Compared to the 2060 super, this is a very lukewarm upgrade. You will enjoy slightly better performance in some games and better performance on DLSS and ray tracing. So I would say it performs like a 2060 super (or 2070 non-super) with 2070 super DLSS and ray tracing.
This card should really be performing at a 2070 super level overall for the price you are paying and considering that the 3060 Ti performs significantly better for only a little more, this is very disappointing.
An upgrade from 6 GB of VRAM (2060/2060 super) to 12 GB probably won’t affect you too much at 1080P but it could be helpful at 1440P later down the road. One good thing too is that it does overclock decently.
I put +200 on the core clock and +700 on the memory and got a 7-10% boost in some games. In addition, Resizable BAR is available for this card and gives an additional boost in select games (NVIDIA chooses which games).
To make sure Resizable BAR works, check that your motherboard manufacturer supports it with your CPU. I have a Ryzen 2700X and it worked perfectly fine with an Asus Prime X470-Pro. This was confirmed in the NVIDIA control panel and gave about a 4% boost in Borderlands 3.
If you are into rendering, this card could definitely be worthwhile, as it is significantly better than the 2060/2060 Super in this area. I would only recommend this card to someone who can pay close to the $329 MSRP (in this case $389, which was OK).
DO NOT pay anywhere close to the $500 that some 3060 models are going for. You should get a faster 3060 Ti, 3070, 5700 XT, or 6700 XT at that price.
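The stacked gains this reviewer describes (overclock plus Resizable BAR) compound multiplicatively rather than adding up. A minimal sketch of the arithmetic: the ~8% and ~4% figures come from the review above, while the 60 FPS baseline is an assumed example, not a measured value.

```python
def stacked_gain(base_fps, gains):
    """Compound a list of fractional uplifts onto a baseline FPS."""
    fps = base_fps
    for g in gains:
        fps *= 1.0 + g
    return fps

# Review's figures: ~8% from the +200/+700 overclock, ~4% from
# Resizable BAR. The 60 FPS baseline is an illustrative assumption.
print(round(stacked_gain(60.0, [0.08, 0.04]), 1))  # 67.4
```

So the two uplifts together are roughly a 12.3% overall gain (1.08 × 1.04), slightly more than the 12% you would get by simply adding the percentages.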
The GeForce RTX 3050 is built with the powerful graphics performance of the NVIDIA Ampere architecture. It offers dedicated 2nd gen RT Cores and 3rd gen Tensor Cores, new streaming multiprocessors, and high-speed G6 memory to tackle the latest games.
Excellent budget option given the still-problematic market and prices. It will handle games just fine at medium to high settings, but very high and ultra definitely seem to give it problems. Cooling is great.
Even when it’s pushed to its max, the fans barely have to ramp up and are hardly audible, and the temps stay well below 70°C in every test I’ve run. This will be a great card for an average gamer, but it is unlikely to satisfy more passionate gamers.
Great card for the price. It runs very cool under full load, tending to stay under 50 degrees Celsius in almost every game; when idle it hovers around 24 to 28 degrees Celsius. It’s maybe not the best if you’re looking for max refresh in every game, but with little tweaks you can easily achieve 90 to 100+ FPS in a lot of games.
You can use ray tracing, which looks great, but it will diminish your frame rate by 30+ FPS in most games. If you’re looking for a good 1080p card, this is the new entry-level gaming card for it.
Boostable up to 1665 MHz, 896 CUDA cores, Turing architecture, 4GB of GDDR5 VRAM, 8 Gb/s memory speed, 128-bit memory interface, HDMI 2.0b and DVI-D DL outputs, dual-fan cooler. Based on the Turing architecture, the MSI GeForce GTX 1650 4GT Low-Profile graphics card provides improvements in performance, memory bandwidth, and power efficiency over its Pascal-based predecessor.
The front panel of the card features a variety of outputs, including HDMI 2.0b and DVI-D DL. The GTX 1650 is not just about high-resolution gaming; computationally intensive programs can utilize the GPU’s 896 cores to accelerate tasks using CUDA or other APIs.
For cooling, MSI implemented dual fans, which maximize downward airflow and air dispersion to the heatsink below them.
I would recommend the MSI GeForce GTX 1650 to anyone in need of a great-performing half-height video card. From what I’ve found, this is the most powerful half-height video card as of January 2021. It’s easy to install and will work with the stock 250W power supplies in small desktop and SFF PCs.
I was lucky and just ordered a second one of these cards for my daughter’s 7010. I was on a waiting list for almost a month before another was available. I’m using these cards to replace GeForce GTX 1050 cards from a different manufacturer.
The 1050 cards were glitchy, causing the monitor to briefly black out 10+ times a day. Very frustrating when you’re attending school remotely, and even more frustrating when playing a game with your friends.
😉 Troubleshooting ruled out heat or a power issue with the PC, and different monitors were tested too. After swapping to the 1650, the blackout issue has not occurred once in the past month, confirming the issue was with the 1050 card.
One more to replace. The current NVIDIA GTX driver was already installed, so the swap from the 1050 to the 1650 was hardware only.
“Newer” systems don’t even see it as an NVIDIA video card. It shows up as a generic video card and loads the standard Microsoft drivers. The downloaded NVIDIA driver says no, this is not an NVIDIA card.
I even tried this on an MSI motherboard (this is an MSI GeForce video card) and got the same thing. I tried it on an old Intel Core 2 Quad system and, for some odd reason, it shows up there perfectly and loads the driver.
However, the Core 2 Quad is far too weak a CPU to pair with this video card for gaming; I was just curious if it would work on anything at all. All systems are running Windows 10. Would this work better with Windows 7? I can run my low-profile GTX 1050 Ti in any of these computers without an issue.
I’ve had this card for two months and can’t use it because of this driver issue. I like Windows 7 but was hoping to finally set that OS aside. I’m not sure what the solution is yet; I will update this review if I figure something out.
So far, so good! I would buy from this seller again. The card arrived in good condition, ahead of time, and was an easy install for a simpleton like myself. I am no tech wizard, and this is my first time buying parts for my computer.
I have an old Dell OptiPlex 990 and just wanted something to get my feet off the ground, so a video card was a must. Others were above my tiny budget, while this one was much cheaper in comparison. And it was on the market at the time I needed it.
As a budget gaming graphics card, the Gigabyte GeForce GTX 1650 OC 4G offers good performance at a good price. Thanks to its small size, the GeForce GTX 1650 fits into many Mini-ITX cases and is a perfect graphics card for upgrading older systems.
With a stated power consumption of 75 watts, the reference design of the NVIDIA GeForce GTX 1650 also doesn’t require any additional power supply. The foundation of this compact and efficient performance is the TU117 graphics chip, based on the Turing GPU architecture.
This is the heart of the GTX 1650 and is manufactured on a 12 nm process. This model is equipped with 4 GB of video memory to the GDDR5 specification, connected to the graphics chip via a 128-bit memory interface.
The 1,485 MHz base clock of the TU117 graphics chip in the Gigabyte GeForce GTX 1650 Mini ITX OC 4G is automatically increased to a 1,710 MHz boost clock by GPU Boost 4.0 if the graphics card requires additional computing power.
And thanks to an effective dual-fan cooler design, the boost lasts longer, so the graphics card delivers optimal performance. The 4 GB of video memory, connected to the chipset via a 128-bit memory interface, clocks at 4,000 MHz (effectively 8,000 MHz).
The 19.1 cm long graphics card is supplied with power via the PCIe slot and does not require an additional power connection.
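Those memory figures determine the card’s theoretical bandwidth; a quick sketch of the arithmetic, using the 128-bit bus and 8,000 MHz effective clock from the description above:

```python
def memory_bandwidth_gbs(bus_width_bits, effective_clock_mhz):
    """Theoretical bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8          # 128-bit bus -> 16 bytes
    transfers_per_sec = effective_clock_mhz * 1e6    # effective (double data rate) clock
    return bytes_per_transfer * transfers_per_sec / 1e9

# GTX 1650 (GDDR5): 128-bit interface at 8,000 MHz effective
print(memory_bandwidth_gbs(128, 8000))  # 128.0 GB/s
```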
The most expensive component in my first ever PC build. Due to the GPU shortage of 2021 and insane scalper pricing, I settled on this video card, for which I paid $358 despite its $149 MSRP. Crazy, I know, but I really wanted to play some good games.
The card is very easy to install and works immediately, with no need to install drivers, though updating them was necessary. I have been using it for 4 months now, paired with a Ryzen 5 5600G.
The GPU overrides the Ryzen APU, so there’s no need to configure anything; it just works. I use the DisplayPort output, which allows an NVIDIA GPU to output Vsync to a monitor. The GPU is overclocked out of the box and I have had no issues whatsoever with overheating.
I have gamed for a few hours at a time and never observed the temperature go higher than around 75°C, and I only have ONE case fan (in the rear) running at around 300 RPM. My PC case doesn’t even have significant airflow through the front panel, either;
just a large vent on the top. And everything has been fine with cooling. I have played Disco Elysium: The Final Cut at normal settings and God of War on low settings, and the games still look absolutely amazing.
Performance was excellent, despite a few occasional stutters and freezes here and there, but they were so rare that they didn’t really detract from the experience. Unfortunately, however, this GPU is not powerful enough to run Elden Ring, even at low settings.
I will probably eventually upgrade to a better GPU when it becomes more affordable. But considering this was my first ever PC build, during a pandemic with supply chain issues, it generally works well for most modern games at low or normal settings.
I paid $170 OTD for this GPU in October, 2020 before prices started skyrocketing due to supply shortages and scalping activity, but even at $170 I would rate this card 3 eggs, and would rate this one egg if I paid $300+.
After 2 months both fans started becoming unbalanced and started making a loud whiney noise. At this point GPUs skyrocketed, so I emailed Gigabyte and asked for replacement fans and told them I would repair the card myself.
It took them almost two months to respond, but they did send me both replacement fans, no questions asked. Three months later, the replacement fans started having the same issue again, and at this point I just uninstalled the fans and underclocked the GPU so it wouldn’t overheat.
I was doing engineering work and could not afford to lose time, or buy another card which doubled in price. Two months after I uninstalled the GPU fans (keep in mind my GPU never shut down due to overheating issues and I even lowered the max temp auto shut-down tolerance to ensure a longer life for the GPU) one of my HDMI ports started failing and losing signal.
I used multiple cables, etc., to diagnose the problem and concluded it was an HDMI port issue in the GPU. I only use a dual-monitor setup, so I can still use this card, but it has been problematic since the start.
I regret not being able to RMA the entire card for a replacement but due to what I mentioned before, I was stuck with it. With the current price for this card, I wouldn’t recommend anybody purchasing this, but for $170 I’d say go for it.
It’s very likely I just got a sub-standard card and got unlucky.
First of all, let me clarify my rating: if I were judging the GPU alone, it would get a 5/5, hands down. The amount of power you get in such a small form factor is incredible, and it really shows how far graphics technology has come in the last 10 years.
The fact that you can slap this card into nearly any system and be able to play most games on high at 60fps in 1080p should’ve been Nvidia’s marketing angle for the 1650. That being said, this is by no means an enthusiast card.
If you have enough space/power in your system, definitely get something else, RX 580, 1660 Ti, anything else. IMO the only use case for the 1650 is in SFF PCs, that’s where this card *really* shines.
If you only look at frames per dollar, the 1650 blows, no question. However if you look at frames per watt, the 1650 absolutely decimates anything near it. That being said, I am absolutely appalled at the lack of effort Gigabyte put into the thermal design of this card.
The dinky piece of aluminum they call a heatsink struggles to keep the 1650 cool under load and thermal throttling very rapidly becomes a problem. Overclocking is out of the question without raising the thermal limit to dangerous highs.
I was able to get the card stable at 1850MHz, but only with the thermal limit set at 90C in MSI Afterburner, and you bet it was pinned at 90C for the entire benchmark. This might sound silly (“yeah, no dip, the card runs hotter when it’s overclocked”), but hear this: the Zotac GDDR5 LP version of this card I had before ran stable at 1985MHz, and the thing never broke 85C.
The heatsink on that card is also entirely aluminum, but it was much larger, thicker, had more fins, and was overall designed a lot better. And I get it, the 1650 isn’t going to get the same premium treatment as the flagship GPUs.
I just would’ve thought the company behind the legendary Windforce series of cards would have been a little more thoughtful in designing the thermal solution for this guy (especially considering it costs the same as their full-size 1650s). Definitely knocked my opinion of Gigabyte down a notch or two. Everything considered, I still absolutely recommend the 1650 for small system builders, just not this card specifically. If you can, I would wait for more board partners to release their versions of the LP 1650 D6 and go with one that has a more robust cooling solution (looking directly at you, Zotac).
However, if you just gotta have the LP GDDR6 version of the 1650 right now, this card works alright. Not great, but alright. I will most likely be selling this card once other, properly cooled LP GDDR6 1650s come out and buying one of those instead if that says anything.
Overall, I’d give this card a 6 out of 10. Love the low profile implementation of the GDDR6 1650, but I just can’t overlook how badly designed the thermal solution is. Really holds back the potential this thing has and as an enthusiast it breaks my heart.
I’m not a gamer. I bought this because I have an old Dell Inspiron 531-S with a Win10 upgrade that needed something better than the onboard graphics unit which staggered along playing any type of video.
I installed it with the stock 250-watt power supply that came with the 531 PC (because the upgraded one I bought was DOA and I figured, let’s see how it works). And it does. I can watch TV shows, watch and edit my recorded videos, etc.
Prior to the upgrade, the onboard graphics adapter had all it could do to keep up with any streaming. Matter of fact, that fancy Windows screen saver with all the spirals couldn’t even run smoothly. Some hints about installing:
It comes with the MSI install CD, which has a lot of add-on things I just don’t need, so I opted to install the drivers directly from Nvidia. There is an option in their setup that allows you to do a clean install.
Use it. Prior to installing the drivers, I booted the PC into BIOS setup mode and changed the advanced graphics settings to use the PCI-E slot for the initial graphics boot; it saves some time when you boot your PC.
The PC will boot in native resolution, and after the drivers are installed, all looks normal again. BTW, this GPU supports DirectX 12, a real benefit. My PC exhibited some initial power-on issues where the monitor power indicator wouldn’t come out of amber mode, so no video displayed.
Power cycling the PC fixed that. Not sure if it was a driver issue or not, but Nvidia had a new driver and I installed it; still waiting to see if it fails again. If it’s not that, well, maybe a new PSU is in order.
By the way, that cheap 300W supply you see from Marketplace Outlet for less than $25 might not be in your best interest. Another BTW: go to techpowerup.com and get their video card analyzer. It’s free and will tell you everything and more about your video card and how it’s running.
It’s called GPU-Z. And another freebie: look for Speccy from CCleaner. It’s free and will tell you everything you need to know about your PC, including all those missing Microsoft keys for your software.
I downloaded the latest driver from nVidia, installed, and was up and running in no time. I used the included low profile slot adapters to install the card into an HP 8300 SFF computer. This computer will be used as an HTPC (Home Theater PC).
The card includes HDMI, VGA, and DVI-D connectors, allowing any monitor you have to work with it. Under load (streaming video from the Internet or my home server) the card never gets above 95 degrees F. The built-in video in the HP 8300 computer actually has ‘slightly’ better performance.
However, since the computer only has an available VGA and DisplayPort connector I needed to install this card. My TV only supports HDMI (VGA too, but yuck, pushing 1920×1080 with VGA yields a bit of noise in the signal since VGA’s max resolution supported is 1920×1200).
Digital is the only way to go with HD resolutions or higher. Oh, one other thing: when you install nVidia drivers, make sure to deselect everything except the drivers themselves. Even then, a ‘Telemetry’ and a ‘Container’ service will be installed by default.
Simply use MSCONFIG to disable these two services. There will be no impact on the performance of the card. These are cr@p-ware services included by default with the drivers, not needed at all.
If you’re considering upgrading an older DELL 990 or similar compact desktop PC and you don’t need a high-end 3D GPU video board with three fans and a big price tag, I would recommend the ZOTAC GT 710.
Installation was easy. The board comes with a full-size bracket installed and a compact case bracket is included in the package. It’s quiet. With a large passive heat sink, there is no worry of a fan bearing failing (and the annoying sound it makes as it dies).
Drivers are included on a CD, but Windows 10 recognizes the card on its own. Download the most up-to-date drivers from NVIDIA. There are a few informative reviews on YouTube regarding the ZOTAC GT 710. If you have a few more bucks to spare, you might consider the ZOTAC GT 730 for a slight performance improvement.
The WINDFORCE 3X cooling system features three 80mm unique-blade fans with alternate spinning, five composite copper heat pipes in direct touch with the GPU, a 3D active fan, and screen cooling, which together provide high-efficiency heat dissipation.
I am very happy with my purchase. My particular card overclocks extremely well. I am able to get very close to the performance of the 3070 I sold, like a dirty scalper, for almost twice what I paid, after suffering a year with a 1060 3GB 😂.
I am able to get a stable OC scoring 12403 in Time Spy (my 3070 got 12650 stock and didn’t really overclock well at all) using a simple config in MSI Afterburner: 104% power, a +175MHz core offset, and an incredible +950 memory offset! This lets me crank all of the games I play at 1440p, with the help of DLSS on quality settings, and get 90-120 fps (Forza 5, Shadow of the Tomb Raider), and even Cyberpunk 2077 with cranked details and ray tracing at playable frame rates (consistently above 60), paired with a 12600KF.
I can’t guarantee other cards will do as well, as it is always the silicon lottery, but most reviews and posts I’ve seen confirm that these Gigabyte cards especially are capable of incredible memory overclocks and pretty decent core overclocks. Their most limiting factor is power: my card tops out at 215 watts with just one 8-pin, which is close to the absolute max and means I’m constantly drawing 65-75W from the PCIe slot on the mobo and 150W from the cable! I wish this card came with one more 6-pin, but that only comes on the PRO, and in this market they’d probably just go ahead and charge you for a 3070, because it will definitely get VERY close to, or even match, 3070 stock performance with just a little more power.
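For anyone wondering how 215 W squares with a single 8-pin connector, the arithmetic can be sketched like this (a rough check: the 75 W slot and 150 W 8-pin figures are the nominal PCIe limits, and the 215 W total and ~70 W slot draw are the numbers reported in this review, not measured specs):

```python
# Rough power-budget check for a one-8-pin card. The limits below are
# nominal PCIe figures; the example draws come from the review above.

PCIE_SLOT_LIMIT_W = 75   # what a PCIe x16 slot is specified to deliver
EIGHT_PIN_LIMIT_W = 150  # nominal limit of a single 8-pin PCIe cable

def power_headroom(card_draw_w, slot_draw_w):
    """Return (cable draw, remaining headroom) for a slot + one-8-pin card."""
    cable_draw = card_draw_w - slot_draw_w
    budget = PCIE_SLOT_LIMIT_W + EIGHT_PIN_LIMIT_W
    return cable_draw, budget - card_draw_w

cable, headroom = power_headroom(card_draw_w=215, slot_draw_w=70)
print(cable)     # 145 W has to come through the 8-pin cable
print(headroom)  # only 10 W left before the 225 W combined nominal limit
```

Which makes the reviewer’s point concrete: the card is essentially at the ceiling of what one 8-pin plus the slot can nominally supply.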
I do need to rethink airflow, however, as the card stays under 85 degrees with the overclock applied, but it sounds like a whiny jet engine doing so! I think I have too much exhaust and not enough intake in my H510 Flow, so hoping that can be fixed.
It would be pretty silly to consider custom water cooling a 60-series card that I already spent too much on, but them fans! I’d much prefer two 120mm fans over three 70s; smaller fans have that distinct whine when they spin fast that sounds right out of 2004 gaming PCs. RGB is fun, but I wish the RTX logo lit up, or that there were something on the backplate besides just the big Gigabyte logo.
I’d say for most people, as of this moment in time, the 6600 is probably a better buy (you can find them pretty regularly for sub-$350), but if you like ray tracing and are OK with DLSS to help you along, this card is not a bad choice.
You just have to be willing to pay the NVIDIA tax. Of course, if you can keep waiting, do it! If you have a 1080, a 2060 Super or above, or a 5700 XT or above, just keep holding out a little longer! You missed your window to sell those for $700, so hold out till Intel finally enters the market! I probably won’t buy an Intel GPU for at least a few years, but hopefully it helps drop all GPU prices!
A decent GPU, if you’re able to pick it up. It’s a bit more expensive than an FE card, but sometimes that’s all you’re able to get. It runs Destiny 2 and Halo Infinite near max settings at an average of 90 fps, at 1440p.
The thermals seem to be good; I’ve yet to see my card break 72 C. The screen cooling forces a lot of hot air upwards and can make the top of your case feel hotter than usual, depending on your cooling setup.
Overall, a good card. I haven’t had any issues with it, but at around $600, it’s kind of tempting to shoot for a 3070 Ti FE instead. Then again, if this is the only card you’re able to get, it’s not too bad a deal.
I built a new PC, which includes this card. I upgraded from a GTX 980 and i7-6700 build that I put together in 2015; it just wasn’t performing as well as I wanted it to with the current generation of games.
All I have to say is that this card is absolutely fantastic. I am running this GPU paired with an i5-12400F and it performs brilliantly for what I need it for. It is quiet, has incredible performance, and is super good looking with the RGB’s on two sides of the card.
I am able to run Hell Let Loose on Ultra with well over 100FPS, Squad at the highest settings – averaging 110 and Tarkov with over 100 as well. Would recommend this card to anyone, if it is in your price range.
MSI Radeon RX 570 Armor Overclocked Dual-Fan 8GB GDDR5 PCIe 3.0 Graphics Card. Game in style and dare to be different with MSI’s unique ARMOR graphics cards, inspired by advanced armor shielding with a classy black & white finish.
ARMOR graphics cards are perfect for gamers and case modders who are looking for something different. This is where gaming meets class.
***NOTE***: This goes out to all of you people who keep having the “black screen to freeze” or “freeze and forced to hard boot” or “PC shutting down” issues. I had the exact same issue, only I did a proper troubleshoot before leaving a negative review.
***IMPORTANT***: CHECK YOUR POWER SUPPLY. This card uses slightly more power than a typical GPU. Everyone says they have tried everything without success, yet NO ONE MENTIONED ANYTHING ABOUT THE POWER CONSUMPTION.
If your power supply is less than 600-650W, you are going to need a better one, hands down. GPUs only use as much power as they need in the moment, depending on the load: the higher the load (i.e., playing a graphically intensive game vs. checking your e-mail), the more power the GPU draws. If your power supply can’t deliver enough power for the GPU while it is handling heavier loads, it shuts off because you have exceeded the “ceiling” of how much power is available.
That is your PC shutting off, not the card. TL;DR: make sure your PC has enough power to supply the GPU before bothering MSI with an RMA and leaving a nasty review. The cards aren’t DOA; the people using them are.
Learn some basic computer knowledge and logic before automatically blaming them!
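The reviewer’s PSU advice boils down to simple arithmetic, which can be sketched like this (all wattages and the 20% safety margin are illustrative assumptions, not measured figures; check your own card’s spec sheet for real peaks):

```python
# Back-of-envelope PSU sizing: does the supply cover peak GPU draw plus
# the rest of the system, with some margin? The numbers are assumptions.

def psu_ok(psu_watts, gpu_peak_w, rest_of_system_w, margin=0.2):
    """True if the PSU covers peak GPU plus system draw with a safety margin."""
    required = (gpu_peak_w + rest_of_system_w) * (1 + margin)
    return psu_watts >= required

# e.g. a card peaking near 180 W plus ~250 W for CPU, drives, and fans:
print(psu_ok(500, 180, 250))  # False: a 500 W unit leaves no headroom
print(psu_ok(650, 180, 250))  # True: in line with the 600-650 W advice above
```

A supply that passes this kind of check at idle but fails it under full load is exactly the “black screen / hard shutdown” scenario the reviewer describes.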
So I was looking for the ultimate budget card when I made my first build, and this is what I found. At the time it was going for nearly 170 dollars and was comparable in price to 4 GB models. This card does what it is supposed to do.
It plays games well at 1080p60, and you can crank up the settings decently far before you run into drops below 60. You can even max out a few titles if you want. It is clear the cooler is quite cheap, as there are parts on the front of the card still exposed, while there is no backplate at all.
The MK2 addresses this, but the front part of the cooler still doesn’t quite cover everything. This card runs a tad warm, and the stock fan curve is terrible, at least if you max out the power slider, which I’d recommend doing.
It has to ramp the fans up to 3000 RPM at otherwise stock settings to keep it at 60 degrees. You can turn that curve off, though, and set a much less aggressive one. Mine runs at roughly 1700-1800 RPM and keeps the card under 70 degrees, usually sitting in the mid-60s under load, with 70 being the peak it hits.
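A custom fan curve like the one described is just interpolation between temperature/RPM breakpoints. A minimal sketch (the breakpoints are illustrative, loosely based on the numbers above; they are not a recommendation for any particular card):

```python
# A fan curve maps GPU temperature to a target fan speed by linear
# interpolation between breakpoints, clamping at both ends.

CURVE = [(40, 1000), (60, 1700), (70, 1800), (80, 3000)]  # (deg C, RPM)

def fan_rpm(temp_c):
    """Linearly interpolate a target RPM from the curve, clamping at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, r0), (t1, r1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)

print(fan_rpm(65))  # 1750.0 -> mid-60s load stays in the quiet band
print(fan_rpm(85))  # 3000   -> full ramp only past the top of the curve
```

Tools like MSI Afterburner apply exactly this kind of mapping; the trick is placing the steep ramp above the temperatures the card actually sits at under load.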
I recently saw an 8 GB RX 580 from Sapphire with a backplate and a nice cooler for less than what I paid, and it normally goes for 180. I’d highly recommend spending a little bit more money on a nicer card than cheaping out this much.
However, if your budget is that tight, and you need a solid 1080p card, this is the one for you.
Bought this used from a miner and flashed it back to factory firmware; it works well. It seems temperamental about heat (65+ C) and OCing, but it’s already the OC edition, so I just left it alone. Ultimately it gets similar FPS with more stability than my GTX 1060 3GB.
1.5-year update: it appears to be unstable in my system with an ASUS B350 board, resulting in games crashing. It doesn’t crash in my other system, which is almost identical except that it uses a Gigabyte B350 board.
However, it occasionally produces a static screen upon waking up which requires a hard reboot. I have tried multiple flashes and firmware versions with similar results. About a year after purchase I had to replace the thermal paste due to rising temperatures.
At that time I installed the RX 580 firmware on it, since they’re basically the same card, and my PC now recognizes it as a 580. I can’t say I noticed any performance differences.
Based on the Ampere architecture and designed to handle the graphical demands of 4K gaming and high frame rates, the Gigabyte GeForce RTX 3080 VISION OC (rev 2.0) Graphics Card brings the power of real-time ray tracing and AI to your PC games.
The Ampere architecture features 2nd Gen RT Cores and 3rd Gen Tensor Cores. The GPU features 10GB of GDDR6X VRAM and a 320-bit memory interface, offering improved performance and power efficiency over the previous Turing-based generation.
The front panel of the card features a variety of outputs, such as DisplayPort 1.4a and HDMI 2.1. HDMI 2.1 supports up to 48 Gb/s of bandwidth and a range of higher resolutions and refresh rates, including 8K @ 60fps, 4K @ 120fps, and even up to 10K.
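For a sense of how those modes fit into 48 Gb/s, the raw bandwidth math is easy to do yourself. A back-of-envelope sketch (it ignores blanking intervals and FRL encoding overhead, so real link requirements are somewhat higher, and the higher modes rely on DSC compression):

```python
# Raw pixel bandwidth for a video mode. 30 bits per pixel assumes
# 10-bit RGB; this is an approximation that ignores blanking and
# link-encoding overhead.

def video_gbps(width, height, fps, bits_per_pixel=30):
    """Raw pixel bandwidth in Gb/s."""
    return width * height * fps * bits_per_pixel / 1e9

print(round(video_gbps(3840, 2160, 120), 1))  # 29.9 -> 4K @ 120 fits in 48 Gb/s
print(round(video_gbps(7680, 4320, 60), 1))   # 59.7 -> 8K @ 60 exceeds it raw,
                                              # hence DSC compression
```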
The RTX 3080 is not just about high-resolution gaming. Computationally intensive programs can utilize the GPU’s 8704 cores to accelerate tasks using CUDA and other APIs. This graphics card is the LHR (Lite Hash Rate) version.
The value of the Hash Rate is for reference only and may vary depending on your PC hardware specifications, quality of your internet connection, and/or other factors. For cooling, Gigabyte implemented the WINDFORCE 3X cooling system.
It features two 90mm fans and one 80mm fan. The fans feature unique blades, where the airflow is split by the triangular fan edge, and guided smoothly through the 3D stripe curve on the fan surface. With alternate spinning, the fans reduce turbulence and increase air pressure.
The 3D Active Fan provides semi-passive cooling and the fans will turn off when the GPU is in a low-load or low-power state. There are also seven composite copper heat pipes and a large copper plate with direct contact to the GPU and VRAM, which efficiently transfer heat.
The extended heatsink design allows airflow to pass through, providing better heat dissipation. It also has RGB Fusion 2.0, which has 16.7 million customizable color options and a variety of lighting effects.
You can also use it to synchronize with other AORUS devices.
I’ve always bought second-tier cards, as the top tier was always out of reach price-wise, and then I run them for years: TNT2, GF2 Ti, 6800GT, 260GTX-216, 670GTX, 970GTX. Then I could only dream of the 1080s and 2080s after the crypto miners smashed the prices of those generations of cards out of my ballpark.
So my 970 finally got retired for a 3080 after promising myself a treat for ages. So, pick any block of two words: Do It! Get One! Need This! Running a 2560×1600 screen as my main, the 3080 feeds me buttery, silky-smooth visuals.
I want another one for my secondary rig that my 970 is now powering, as in comparison (same screen/resolution) the 970 just can’t handle it. I may wait a few generations to make the jump to a new card, but I am NEVER going to do myself down again on the premise of saving a few bucks if this is what top-tier life is like; and I only buy every 3-4 gens, when the performance is worth it.
Speaking of worth it, to my mind the 3090, at 15% extra performance for double the price, is just for brags from what I can see. But if that extra 14GB of VRAM and smidge of extra performance is what you need, then it’s worth it to you; so be it.
But for me, the 3080 is where it’s at. Get one. Do it. Do it now! Card cooling is quiet, even with the side of the case off. Just be sure your PSU has the chops and the proper connectors, not just some no-name cheap-o brand with a couple of dodgy something-to-8-pin PCIe adapter cables.
This thing pulls juice. Oh, and Overclockers: solid service and info, as always :)
Dude, this little beast (yet still very large) is ungodly fast; it’s at least twice as fast as my 2070, and at twice the resolution! Everything on Ultra High, as high as graphically possible, and literally not even a hiccup! So far, that goes for every game I tested, plus 3DMark, including the ray tracing benchmarks.
Supposedly you can overclock it too, but there’s really no need to. Yet. Right now it’s nice, quiet, and cool! I do have to buy a new PCIe USB-C 3.1 card for my VR, though. Or maybe a DisplayPort-to-USB-C adapter? There isn’t one like there was on my MSI GeForce RTX 2070 Armor.
I was lucky enough to get one of the early December combos and have been using this 3080 since then fairly heavily. I originally got it to play Cyberpunk maxed out at 1440p, which it does very well, albeit with DLSS.
It also plays Red Dead Redemption 2 maxed out very well, and all the Total War games: Warhammer 2, Three Kingdoms, and Troy. Keep in mind those games are CPU-hungry as well, so this GPU needs to be matched up with a good 6- or 8-core CPU.
Don’t cheap out on a CPU or a good mobo when you spend $800+ on a high-end video card. As for looks, I’ve never had a card this attractive. It looks futuristic and will probably age well because it doesn’t have all sorts of gamer stuff on it.
Just a good clean design. Since February, I’ve been using it to mine Ether through Nicehash in off hours, which it has also done well. Keep in mind Gigabyte doesn’t do the best job providing cooling to the GDDR6x, (a problem for most of their cards) so the fans need to run 100% to provide adequate cooling.
No idea what this will do for fan life, but it’s important to point out.
VR Ready: Discover next-generation VR performance, the lowest latency, and plug-and-play compatibility with leading headsets, driven by NVIDIA VRWorks technologies. NVIDIA Ansel: Turn your screenshots into art with this powerful in-game photo mode that captures 360-degree, HDR, and super-resolution photos. NVIDIA GeForce Experience: The essential companion to your GeForce graphics card.
Capture and share videos, screenshots, and livestreams. Keep your drivers up to date and optimize your games. NVIDIA G-SYNC Compatible: Get smooth, tear-free gameplay at refresh rates up to 240 Hz, plus HDR, and more.
This is the ultimate gaming display and the go-to equipment for enthusiast gamers. Game Ready Drivers: Get the highest levels of performance, and the smoothest experience possible, from the moment you start playing.
Microsoft DirectX 12 API, Vulkan API, OpenGL 4.5: Power new visual effects and rendering techniques for more lifelike gaming. HDMI 2.0b and DisplayPort 1.4: The latest standards in DisplayPort and HDMI interfaces.
Bought this to upgrade older HP Envy with AMD FX-6350 (about 7 years old) to play World of Warships. Stock video was 24fps with medium-low settings. This card rocks at 78fps at max settings. No more sluggish response, or weird video artifacts.
The game video quality is stunning at max settings. This is a perfect video card to update an older PC without extra heat and fan noise.
I was always at the cutting edge of technology until my priorities changed (circa 2010). Now with COVID I get to sit back and game a little, but then my vid card crapped out. This little gem is worth the price as it keeps up with the current games and all at a pretty low price.
This will hopefully last me until this computer craps out.
In conclusion, small form factor graphics cards are an excellent choice for gamers who want the best performance possible in a smaller package. They are also a great choice for those who want to save money on their electric bill, as they consume less power than larger graphics cards.