I know the title looks insane, but hear me out.
I wanted to make a game, specifically an isometric 2.5D RPG, nothing that fancy.
So I decided to write it in Rust because I… like Enums, or something, and I had already finished the concurrency chapter, so I should be able to hang on.
Bevy seemed like the most complete engine out there, if overkill for my use case, but it’s presumably so modular I could de-bloat it adequately. But…
Someone I know has a laptop; it is old.
It is not super old: a Toshiba Portege R700 with a locked BIOS. It took me 3 days to trick its Windows 10 boot loader into booting Debian, and along the way I accidentally, yet irresponsibly, broke Windows and installed GRUB.
There is no reason the thing shouldn’t be able to run my hypothetical game. I personally saw it LANning Sven Co-op (Half-Life co-op) not long ago; the beast could maintain 60 FPS for a solid few minutes before overheating and dropping to 5, but it should hold on better now that I’ve installed GNU/Linux.
So, just to make sure, I ran the command that exposes the OpenGL version used by Mesa (glxinfo), and… OpenGL (ES?) 1.5? I surely need a plan.
I noticed this issue with many indies. Many would-run-on-a-2005-ThinkPad games sacrifice this untapped market for features they never use or, worse, ones that go against the artistic vision.
So, since Bevy is modular, can I, a humble would-be intern, write a rendering backend for the 2003 specification, but not before learning what a rendering backend is?
Or can I, a seasoned searcher, find a premade solution for Bevy or any other Rust engine, made to use the 2003 spec, or even the 1992 1.0 spec?
Or would it be worthwhile to construct an engine of my own?
Edit: Or can I, a man of determination, write an FFI for an existing C solution?
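From what I can tell, a hand-written binding to a C renderer is surprisingly little code. A sketch, with every name invented (`fixedgl` and its two functions stand in for whatever real library this would wrap):

```rust
// Hypothetical binding to an imaginary C renderer called "fixedgl";
// none of these names exist, they just show the shape of hand-rolled FFI.
#[link(name = "fixedgl")]
extern "C" {
    fn fixed_gl_init(width: i32, height: i32) -> i32;
    fn fixed_gl_draw_sprite(x: f32, y: f32, texture_id: u32);
}

fn main() {
    // Every call across the boundary is unsafe; Rust can't check the C side.
    unsafe {
        if fixed_gl_init(640, 480) != 0 {
            panic!("renderer init failed");
        }
        fixed_gl_draw_sprite(100.0, 100.0, 0);
    }
}
```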
Google says you have a Core i7-620M; that’s a Westmere with first-generation Intel HD Graphics. Provided everything is configured correctly, you should have OpenGL 2.1.
Distributions might not include drivers that old by default; you should look into that.
…and, no, you probably really, really don’t want to use OpenGL 1.x. If your implementation supports shaders at all then it’s going to be some vendor-specific non-standard thing. If you want to do anything 3D then 2.0 or 2.1 is the bare reasonable minimum (2.1 gets you PBOs).
If you’re doing 2D, well, I don’t think it would be wrong for Bevy to have a 2.5D engine that runs on anything from a flat framebuffer (that is, rendering on the CPU) to Vulkan. Sprites, parallax layers, such things. Very old boxes will need custom code, tightly-timed assembly at that; once upon a time it was considered impossible to get smooth side-scrolling on VGA cards – until Carmack came along. Commander Keen was the game that broke that barrier. In case you’re into the arcane craft of the elders, here’s a book.
Bevy heavily relies on winit. The first step, I’d say, would be to try to get plain winit to give you something, anything, to draw on and play around a bit; then get Bevy into the mixture, disabling pretty much everything that’s not the ECS or input handling, because none of the rendering stuff will work, or even be portable.
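Something like this is what I mean by “ECS and input only” – a minimal sketch, assuming a recent Bevy (the API churns fast); `MinimalPlugins` pulls in the schedule runner and time, but no windowing or rendering:

```rust
use bevy::prelude::*;

// Hypothetical marker component, just to show the ECS ticking headlessly.
#[derive(Component)]
struct Player;

fn main() {
    App::new()
        // MinimalPlugins = schedule runner + time; no window, no renderer.
        .add_plugins(MinimalPlugins)
        .add_systems(Startup, spawn_player)
        .add_systems(Update, greet)
        .run();
}

fn spawn_player(mut commands: Commands) {
    commands.spawn(Player);
}

fn greet(query: Query<&Player>) {
    for _ in &query {
        println!("player exists, ECS is alive");
    }
}
```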
> Google says you have a Core i7-620M,
No, I have an i5; it even has a sticker. But my i5-7500 desktop has an i7 sticker, so stickers aren’t reliable. (I have better evidence than the sticker, trust me.)
> that’s a Westmere with first-generation Intel HD Graphics. Provided everything is configured correctly, you should have OpenGL 2.1.
I rechecked 1-2 days after making the post, and Mesa reported 2.1 & 2.0 ES, which is consistent with TechPowerUp; that website has never failed me, even when I failed myself.
> …and, no, you probably really, really don’t want to use OpenGL 1.x. If your implementation supports shaders at all then it’s going to be some vendor-specific non-standard thing. If you want to do anything 3D then 2.0 or 2.1 is the bare reasonable minimum (2.1 gets you PBOs).
I think shaders are a 3D-only thing, which shouldn’t be needed for an isometric game? My favorite game runs on DirectX 3.0a, so 1.x should work, I guess?
Also I will look up PBOs, that acronym looks cool!
> If you’re doing 2D, well, I don’t think it would be wrong for Bevy to have a 2.5D engine that… tightly-timed assembly at that; once upon a time it was considered impossible to get smooth side-scrolling on VGA cards – until Carmack came along…
About 1-1.5 weeks ago I watched some Jonathan Blow hating on programming paradigms and game engines, and to be honest, I had already been viewing pre-built engines as things that produce games that are… familiar… they lack souls, integer/buffer overflows, and their physics are too well-made. I understand the benefits of a prebuilt 3D engine: I played a 3D game on DOSBox and saved before every jump (except the ones that got me stuck in a wall), in addition to an indie 3D C++ mess on Itch (the only game whose release I’m excited for at the moment, as it’s still in beta).
I also saw an attempt at using SDL as a Bevy rendering backend on GitHub, and there were problems according to the README.md, but I had lost interest by then, because…
After watching much JBlow, I came away with 3 things:
1- OpenGL isn’t worth learning anymore, but don’t use SDL to abstract yourself from it; use raylib. That’s his opinion, so I decided to build my engine on Macroquad, a raylib-like thing (sketch after this list), as using rust-bindgen feels un-ergonomic.
2- ECSes and other pre-forced paradigms are meaningless, just write the game.
3- He doesn’t like Rust, which I ignored.
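Here’s the sketch I promised: roughly all the ceremony Macroquad asks for before you’re drawing and reading input (everything here is a real Macroquad prelude item; only the window title and the numbers are made up):

```rust
use macroquad::prelude::*;

#[macroquad::main("Yellow Square")]
async fn main() {
    let mut x = 100.0;
    let y = 100.0;
    loop {
        // "Just write the game": poll input, move, draw, next frame.
        if is_key_down(KeyCode::Right) { x += 200.0 * get_frame_time(); }
        if is_key_down(KeyCode::Left)  { x -= 200.0 * get_frame_time(); }

        clear_background(BLACK);
        draw_rectangle(x, y, 50.0, 50.0, YELLOW);
        next_frame().await;
    }
}
```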
You might think Rust is a bad choice if I like overflowing things, as Rust is safe, but before you is something I wrote semi-seriously (I say “semi-” so I don’t get laughed at) a day or two ago:
```rust
let mut key = Key::new(500.0, 500.0, 50.0, PINK, unsafe {
    let x = &entities[9] as *const Box<dyn Entity> as *mut Box<dyn Entity>;
    // SAFETY: This is straight up undefined behaviour. I have a solution
    // that is better, simpler and safe, which is separating the entity list
    // and `GameContext`, but I wanted this to exist in the git history.
    //
    // And before I sleep and forget, the way is passing `&mut door` to
    // `update()` manually, and passing `entities` to `walk`.
    //
    // A `RefCell` wouldn't have complained at runtime anyway.
    #[allow(invalid_reference_casting)]
    &mut *x
});
```
As you can see, using anything remotely similar to `Arc<Mutex<T>>` would get me accused of having skill issues, which is extremely bad considering I am one of those “I use NeoVim/Colemak-DH/Rust/Linux BTW” people. So I prefer extreme measures, AKA the regular-C-code style of writing, and it won’t be long before I meet some REAL undefined behavior.
If you’re interested in my little project, it’s about a yellow square in rooms of yellow walls, which I planned to write in 3 days and finished today, a few hours after Fajr prayer, one week after I set the 3-day deadline.
Time to add a 3D turn-based combat system to it, for reasons. First I need a flat yellow cube and some Doom-like fake-3D enemies.
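For the fake-3D enemies the trick, as I understand it, is billboarding: a flat textured quad that keeps turning about the vertical axis to face the camera. A sketch of just the math, using the glam types Macroquad re-exports; `billboard_corners` is a helper name I made up:

```rust
use macroquad::prelude::*;

// Returns the 4 corners of a camera-facing quad, Doom-style: the sprite
// only rotates around the vertical (Y) axis, so it always stays upright.
fn billboard_corners(sprite_pos: Vec3, cam_pos: Vec3, w: f32, h: f32) -> [Vec3; 4] {
    // Flatten the sprite-to-camera direction onto the XZ plane
    // (assumes the camera is never exactly above the sprite).
    let to_cam = (cam_pos - sprite_pos) * vec3(1.0, 0.0, 1.0);
    let right = to_cam.normalize().cross(Vec3::Y) * (w / 2.0);
    let up = Vec3::Y * h;
    [
        sprite_pos - right,      // bottom-left
        sprite_pos + right,      // bottom-right
        sprite_pos + right + up, // top-right
        sprite_pos - right + up, // top-left
    ]
}
```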
> In case you’re into the arcane craft of the elders, here’s a book.
Thanks! I wish to write assembly sometime down the road and discover these treasures of knowledge.
Edit: In case you’re wondering how I solved the code problem, I started thinking and realized that I wanted to pass a `&mut` to `Key::update()`, but I was also holding one inside of `GameContext`, which is impossible in safe Rust, so I changed it to pass the reference in manually instead.
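Concretely, the safe shape looks something like this (types stripped down to the bare idea; the real Key and Door carry more state):

```rust
struct Door { open: bool }
struct Key { collected: bool }

impl Key {
    // The &mut Door arrives as an argument each frame instead of living
    // inside GameContext, so there is only ever one live mutable borrow.
    fn update(&mut self, door: &mut Door) {
        if self.collected {
            door.open = true;
        }
    }
}

fn main() {
    let mut key = Key { collected: true };
    let mut door = Door { open: false };
    key.update(&mut door); // borrows don't overlap; no unsafe needed
    assert!(door.open);
}
```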
They say: “Make it work, then make it right, then make it fast”, but maybe they didn’t mean jumping to `unsafe` when they said “work”.
Edit2: I forgot to say something about prebuilt engines. In addition to creating an inconsistency between the quality of the engine (physics and UI interactions) and the quality of the game-specific code, they are also very bloated, and that’s reflected in binary sizes and the performance penalty of being a general-purpose engine.
Macroquad is also an offender here: a mere `cargo add` bloats the release binary up to 70 whole MiBs, but I managed to shave off 10 MiBs in a few hours by cloning the repo, “forking” Macroquad and its huge math library, glam, and “de-bloating” them, and my game would still compile fine. So there’s much room for optimizing the binary size, as well as performance, since I am low-level enough to implement my own delta time.
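(For the record, hand-rolled delta time is about this much code; Macroquad’s get_frame_time() does the same job. `Clock` is just my name for it:)

```rust
use std::time::Instant;

// Minimal delta-time tracker: measures seconds elapsed between calls.
struct Clock {
    last: Instant,
}

impl Clock {
    fn new() -> Self {
        Self { last: Instant::now() }
    }

    /// Seconds since the previous call; feed this to movement as speed * dt.
    fn delta(&mut self) -> f32 {
        let now = Instant::now();
        let dt = now.duration_since(self.last).as_secs_f32();
        self.last = now;
        dt
    }
}

fn main() {
    let mut clock = Clock::new();
    std::thread::sleep(std::time::Duration::from_millis(16));
    println!("dt = {:.4}s", clock.delta()); // roughly 0.016 on most systems
}
```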
Also, I am crazy about performance, which can be traced back to four painful years of trying to run games on an HP Compaq dc5800, whose GPU is even worse than the laptop’s.