Not exactly a game, but here's a 3D proof of concept I worked on yesterday and finished up today. It works most of the time, and I've already figured out better ways to redo everything, plus things to add like strafing, but I figure it's still worth posting.
Blocks always face the player, and their angle relative to the player determines whether they are within the field of view: if a block falls outside a 90-degree window centered on the player's viewpoint, it doesn't get rendered. The angle determines a block's X position on screen, and the distance between the block and the player determines its Y position. The illusion of perspective (smaller objects farther away, bigger objects closer) is done by scaling down block sprites based on their Y position. Depth is approximated by modifying the alpha value the same way: farther/higher-Y blocks have lower alpha values. From this window everything is rendered in real time, destroyed, re-rendered, and so on to keep the view updated.
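Flowlab logic is built from visual behavior blocks, so there's no code to share directly, but the projection described above could be sketched in Python roughly like this (all names, screen dimensions, and falloff constants here are my own assumptions, not taken from the project):

```python
import math

def project_block(player_x, player_y, player_angle, block_x, block_y,
                  screen_w=320, fov=90, max_dist=300):
    """Map a block's world position to a screen X/Y, scale, and alpha,
    or return None if it falls outside the player's field of view."""
    dx = block_x - player_x
    dy = block_y - player_y
    dist = math.hypot(dx, dy)
    # Angle of the block relative to where the player is facing,
    # normalized into [-180, 180).
    angle = math.degrees(math.atan2(dy, dx)) - player_angle
    angle = (angle + 180) % 360 - 180
    # Outside the 90-degree window: don't render.
    if abs(angle) > fov / 2:
        return None
    # Angle determines screen X; distance determines screen Y.
    sx = screen_w / 2 + (angle / (fov / 2)) * (screen_w / 2)
    sy = dist
    # Perspective: farther blocks are smaller and more transparent.
    scale = max(0.1, 1.0 - dist / max_dist)
    alpha = 100 * scale
    return sx, sy, scale, alpha
```

A block straight ahead lands at the horizontal center of the screen, one at the edge of the window lands at the screen edge, and anything behind the player is culled.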
It is messy spaghetti code, stuff from the right side of the screen can bleed over onto the display, and sometimes it doesn't work for certain walls, but it's good enough. It might be easier to understand using the minimap version below. "Wall" contains the logic needed to render a sprite in the correct X/Y position on the left side of the screen. "Wall render" is the rendered sprite, which contains the logic for rescaling itself and modifying its own alpha value.
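The "Wall render" behavior described above, a sprite that rescales and fades itself based on its own Y position, might look something like this as a Python sketch (the class, `base_size`, and `horizon_y` are hypothetical stand-ins for Flowlab's visual behaviors):

```python
class WallRender:
    """A rendered wall sprite that sets its own size and alpha from its Y position."""

    def __init__(self, base_size=32, horizon_y=240):
        self.base_size = base_size
        self.horizon_y = horizon_y  # Y value at which blocks fully fade out (assumed)
        self.size = base_size
        self.alpha = 100

    def update(self, y):
        # Higher Y means farther away: shrink the sprite and lower its alpha.
        t = max(0.0, min(1.0, 1.0 - y / self.horizon_y))
        self.size = self.base_size * t
        self.alpha = 100 * t
```

For example, `update(0)` leaves the sprite at full size and full opacity, while `update(120)` with the defaults halves both.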
Rendered screen-only version: http://www.flowlab.io/game/play/880631
Rendered screen + "minimap" version: http://www.flowlab.io/game/play/882080
One extra thing I learned from this is that rendered screens could easily be used in multiplayer games to create split screens.