So this got announced:

This is Valve's latest assault on the console market: a six-inch cube that's actually a mini PC, built on everything they've learned from the Steam Deck. It promises six times the power of the Steam Deck, offering 4K gaming at 60fps on current titles, and obviously it works with your existing Steam library. Out early next year, apparently.
I think it's really interesting, and the timing is very telling. Obviously this is a massive shot across the bows for Microsoft, who have been making noises about their next console effectively being an Xbox-branded PC. Valve are clearly aiming directly at this market, and honestly, having seen what they've done with SteamOS and the Steam Deck, I would trust them to deliver great gaming performance far more than whatever generic Windows box MS decide to cook up. I've been all in on Xbox this gen, but clearly I'm going to need an exit strategy, and I love my Steam Deck, so I'm definitely in the target market for this.
I've got a couple of concerns though. Firstly, it appears slightly underpowered: it's coming in at about Series X/PS5 levels at precisely the point those consoles are reaching the end of their lifespans. If this were more like the start of next gen in terms of actual horsepower then I'd be all in, day one, but that doesn't appear to be the case. And secondly, it's still a PC, and while Valve have done wonders with the Deck in terms of making it a seamless console experience, there's no getting away from the fact that some games still require quite a bit of tinkering. I don't mind that, but it does mean this is always going to be an enthusiast product rather than anything mainstream.
Not going to lie though: I kind of want one. Anyone else?
I'm not getting one, just because I have a PC that's pretty much on par with this box. But if it does well, I'll almost certainly get Gabecube 2 as this PC's replacement.
The 4K60 stuff must be bollocks, based on my experiences over the last year. My 8GB GPU wouldn't even dream of a stable 4K60 with ray tracing, even with fancy image reconstruction wotsits. But if they've somehow managed to crack that (they haven't), and it costs under 500 quid (it won't), then this will be a must-buy for a lot of people I reckon.
It's good news for PC gaming in general. If this thing sells bucketloads, everyone with moderately powered PCs (which is most of us) will suddenly become the target benchmark for developers. Maybe they'll finally stop forcing UE5 on everyone.
I think DLSS or similar upscaling / reconstruction is just a given these days, really, for everything. Native 4K is incredibly demanding and honestly not worth aiming for.
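Just to put numbers on that, here's a quick back-of-envelope sketch. The ~67% internal resolution is the typical DLSS/FSR "Quality" preset, so an assumption on my part rather than anything Valve have confirmed:

```python
# Why native 4K is so demanding compared to rendering at a lower
# internal resolution and upscaling: it's mostly just pixel count.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)       # 8,294,400 pixels per frame
internal_1440p = pixels(2560, 1440)  # 3,686,400 pixels per frame
                                     # (typical "Quality" upscaling preset)

# Fraction of the per-frame shading work relative to native 4K:
ratio = internal_1440p / native_4k
print(f"1440p internal render is {ratio:.0%} of the 4K pixel count")
```

So a "4K" image via upscaling shades well under half the pixels of a true native 4K frame, which is why it's basically a given now.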
While I'm curious about the λbox (most of my PC gaming is on the lower-end of the graphics spectrum anyway, so the lesser graphical oomph isn't a dealbreaker, depending on price, obviously), I am very intrigued by the controller. The first-version Steam Controller was an interesting curiosity but ultimately a failure and not that nice to use. I'm guessing that their Steam Deck experience has taught them a lot about improving the button layout and the general feel of the thing. And if it means I can play Timberborn on the sofa, even streaming onto the Steam Machine I still have, then I'm in.
Someone explain the slight dooming over the 8GB VRAM
It's largely the fault of Digital Foundry, who claim that using 8GB of VRAM is only one stage removed from murdering baby seals.
So if I have this right, it just means we won't be running shit in Ultra 8K or whatever, just 'merely' 1080p? Or does it affect game performance too, so like, 30 vs 60 fps?
It's just the video memory, which lets the graphics card store and quickly access graphical data like textures and frame buffers, improving performance and visual quality. If you're running at native 4K, the requirement for more VRAM naturally goes up, as it does if you pile on more advanced visual features. Once the VRAM is full, many games will suffer performance problems like stuttering or texture pop-in. Modern upscaling technologies like FSR and DLSS allow you to have a "4K image" without some of these penalties, and frame generation can insert fake frames to boost performance (at a cost in lag/responsiveness). Assuming the GabeCube is around a mobile 4060 in power, I think it'll be capable enough in most games at 1080p native, medium detail, and then using framegen/upscaling you might be able to push it further.
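A rough sketch of the frame-buffer side of that, if it helps. The buffer count and pixel format below are invented for illustration; real engines allocate a different mix of render targets, and textures usually dominate VRAM anyway:

```python
# Rough frame-buffer VRAM cost at different resolutions.
# Both constants are hypothetical, purely to show the scaling.

BYTES_PER_PIXEL = 8   # e.g. a 16-bit-per-channel RGBA render target
RENDER_TARGETS = 10   # invented count of full-resolution buffers

def framebuffer_mib(width, height):
    """MiB consumed by our hypothetical set of full-res buffers."""
    return width * height * BYTES_PER_PIXEL * RENDER_TARGETS / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mib(w, h):.0f} MiB of render targets")
```

The point being that 4K has exactly four times the pixels of 1080p, so every full-resolution buffer costs four times the memory, before you've loaded a single texture.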
There aren't many modern titles that are backbreaking for a GPU, and those that are tend to be poorly optimised. Avowed is a good example; it probably wouldn't run that well on a GabeCube. I think people's main concern will be the flood of upcoming UE5 games that, to date, have followed Avowed in running pretty shockingly on midrange hardware.
Thanks for the explanation, cav, seems I was sorta getting it. Doesn't sound like something that will bother me, I think. But man, people seem worried ha ha.
UE5 and Avowed in particular seem to get held up as examples of poor optimisation - is that just a PC thing? Avowed looked and ran great on Xbox, as have other UE5 games like Expedition 33. So it's not a universal thing.
Avowed, Immortals, Oblivion, Stalker 2 and other UE5 titles have a lot of problems, but you could argue that sometimes unless you're totally Digital Foundrying it, it's not that perceptible to a mortal.
Any fixed platform normally has an element of just accepting what you get, while on PC you can drive yourself insane tweaking performance. But I found Avowed and Immortals particularly egregious on PC, as choppy frame times and visual instability were impossible to get rid of on the Ally or even my still fairly beefy laptop. Once I brute-forced it on my main PC with a 5070 it ran pretty well, but that's a bit ridiculous. I know the 30fps and 40fps modes on Xbox were good at launch, but the 60fps mode was a bit of a shimmering mess. Might be better now?
Immortals too isn't great, but the dev folded, so a lack of patches probably isn't helping. Stalker 2 is a disgrace - for how it looks the performance is absolutely shocking on anything other than a top end PC.
'For how it looks' is the biggie for me. I totally get the performance hit when I can see a reason for it. The lighting in Assassin's Creed Shadows is an obvious one; I can totally see why there's a trade-off there. It looks stunning in the 40fps mode.
But some of the stuff being churned out on UE5 looks worse than older, more performant games. I'm playing Deus Ex: Mankind Divided at the moment, which is a lovely ten-year-old game with lots of reassuring boomer graphics terms like baked lighting and MSAA. It runs at 200fps on my computer, without any DLSS magic. And it genuinely looks better than these new UE5 titles that are struggling to hit 60fps at an internal resolution of 720p.
I have zero understanding of how these engines work, so I might be way off, but to me it feels like another case of reducing man hours and ending up with something objectively worse, but cheaper to make. (Not cheaper to buy, of course.) Because presumably, someone used to spend months 'baking' the lighting on these games, and now it's all done in real time. If that's not the reason people keep shipping games in this state, then I'm at a loss.
The Global Illumination part of ray tracing (RTGI) is the bit that really makes a difference visually, and you're right, it does in theory allow devs to spend much less time baking in lighting, but it also can look really, really good so I can see why people are pushing for it. In theory the current generation of consoles are a little underpowered for it which is why we've seen a lot of 30fps modes over the last few years, but games ARE starting to appear that use it well "by default" even on console and honestly you can't say that Indiana Jones or Star Wars Outlaws look bad in any way.
If it (a) looks better and (b) allows devs to focus their efforts on other parts of the game than methodically doing manual lighting passes, then it's a good thing. But it might be NEXT gen before it really becomes the accepted way of doing things.
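For anyone curious, the baked-versus-real-time trade-off above boils down to where the expensive bounce-light computation happens. A toy sketch, where every name and the whole scene model are made up purely for illustration (real GI is vastly more involved):

```python
# Toy illustration of baked lighting vs real-time GI.
# "Scene" here is just a dict of surface -> emitted light.

def compute_bounce_light(scene, surface):
    # Stand-in for the expensive part: gathering indirect
    # light bouncing onto a surface from everything else.
    return sum(light for s, light in scene.items() if s != surface)

# Baked lighting: pay the cost once, offline, at build time...
def bake_lightmap(scene):
    return {surface: compute_bounce_light(scene, surface)
            for surface in scene}

# ...so each frame at runtime is just a cheap lookup.
def shade_baked(lightmap, surface):
    return lightmap[surface]

# Real-time GI: no bake step, scene can change freely, but the
# expensive computation runs every single frame.
def shade_realtime(scene, surface):
    return compute_bounce_light(scene, surface)
```

In a static scene both paths produce the same image; the baked version paid its cost offline (those months of lightmap baking), while the real-time version pays it every frame but copes with dynamic lights and geometry.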