csanyk.com

video games, programming, the internet, and stuff

Ludum Dare 41 results

Ratings have been posted for Ludum Dare 41.  InvadTris received these scores:

Overall: 761st (3.298 average from 54 ratings)
Fun: 537th (3.346 average from 54 ratings)
Innovation: 659th (3.265 average from 53 ratings)
Theme: 554th (3.647 average from 53 ratings)
Graphics: 930th (2.817 average from 54 ratings)
Audio: 755th (1.894 average from 35 ratings)
Humor: 935th (2.162 average from 39 ratings)
Mood: 1002nd (2.689 average from 47 ratings)

The rankings may not look very high, but the averages I earned in Overall, Fun, Innovation, and Theme are all solidly above 3, which I am proud of.  I think I might have done even better in the ratings had I completed the project within the window of the jam weekend.  Due to my schedule, I was only able to put in 17 hours during the jam, and I submitted my game after the deadline; it was playable, but lacked the considerable polish I added over the following week-plus.   A lot of people rated the 1.0 release, which according to the rules is proper, but I can infer from the score I received in the Audio category that at least some reviewers rated one of the later builds.

InvadTris 1.0

InvadTris 1.8

Considering I wasn’t necessarily planning on completing anything more than a design document this time around, I think this is more than OK.  I received ratings from as many as 54 peers this time, which I think might be a record.

Importantly, I took away from the weekend a renewed enthusiasm for game development and took great joy in the work.  This project was very fun to work on, and progress was steady and came more easily than in many of my other Ludum Dare projects.

I have still more planned for InvadTris, and will continue to develop it in the days ahead.

How I lost 3.5 hrs of sleep last night by being dumb

I use GitHub, like a lot of developers, to handle my project files.

I like GitHub a lot.  It’s great. I don’t understand it as well as I’d like, but what I do know and use, works very well, and is very useful.

I don’t really know git, the command-line program that GitHub is based around, which does all the version control magic you need. Because I’m a dummy, I only use GitHub Desktop for Windows.  But that’s fine; it does exactly what I need it to do, I haven’t hit any limits with it as yet, and I’m able to understand/figure out its interface to do what I need.  That’s the whole point of a tool. Who has time to read all the docs and learn the muscle memory to type all the commands manually, am I right?

I mention all this because I need to make it clear: I’m not entirely sure where git ends and GitHub begins. I’d been hearing about git and how good it is for, I don’t know, years and years before I finally started using it. About a year ago, I grew tired of all the obstacles that were preventing me from just sitting down and forcing myself to learn to use git from the command line, and decided to just jump in and use GitHub rather than figure out how to set up git on my own server and then do stuff with it via shell commands. I was still worried about doing something that made me look dumb in public, so I didn’t use my GitHub account much at first, until I grew tired of that fear blocking me, and decided to just spend the $7/mo on an account so I could set up private repositories and enjoy the ability to screw things up as much as I dared, without it being on full public view.

This was when I started to make serious progress with getting into GitHub. And I found that GitHub Desktop, the Windows app that wraps a bunch of command-line magic that I wish I knew, but don’t, was amazing. It took a little bit of trial and error to learn the ins and outs, and a very small number of minutes watching tutorial videos.  But that was what was so great about it.  If I had a question, I could search GitHub’s help, and most often I’d find a video that was, at most, about 2 minutes long, that answered my exact question. Most IT tutorial videos feel like they need to spend 20-30 minutes going through some process, which is too much to follow, but these bite-sized videos were just what I needed.

Anyway, now that my background is explained, I’ll talk about something that happened last night.

I have a project that I’m managing in GitHub, my Ludum Dare 41 project, InvadTris. I had just spent the evening going over the project and making a few minor edits, which I checked in.  As I did my commit, GitHub Desktop notified me that it had an upgrade available.  Without thinking, I clicked the button to install the update.  It installed in a few seconds, and then I noticed that while I’d submitted the commit, I hadn’t yet synced it with GitHub.  So I went to do that, but GitHub Desktop prompted me to log in.  I entered my credentials, and authentication failed.  GitHub Desktop showed me as being logged in (I guess locally?), but authentication to github.com through GitHub Desktop was failing.
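For context, the “commit” and “sync” buttons in GitHub Desktop map onto ordinary git commands. Here’s a rough sketch of the equivalent command-line workflow, using a throwaway repository so the commands can run anywhere (the paths and messages are made up for illustration):

```shell
# Set up a scratch repo so the commands below can run standalone.
rm -rf /tmp/desktop-demo
git init -q /tmp/desktop-demo
cd /tmp/desktop-demo
git config user.email "demo@example.com"
git config user.name "Demo"

# The "commit" step: stage the edits and record them locally.
echo "minor edits" > notes.txt
git add -A
git commit -q -m "Minor edits to the project"

# The "sync" step would then be: git push origin <branch>
# (not run here, because this scratch repo has no remote configured).
git log --oneline
```

The key point, which is what bit me that night: a commit only exists locally until it’s pushed.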

I tried logging out, then logging back in, as the interface recommended, but this did not fix the problem. I logged out of GitHub in my browser, and logged back in, just to verify that I did indeed have valid credentials, and this was successful.  I wasn’t sure what else it could be, so I tried removing my SSH key in my GitHub account settings, but this didn’t help, either.

I figured I’d open up a ticket with GitHub Support, and they’d give me a solution, so I started writing up a ticket.  I went to check the version number of GitHub Desktop, and in the About screen it told me that I was on the last version of this particular tool, which was no longer actively being developed.  I learned also from this that there was a new GitHub Desktop, and downloaded and installed that.

The New GitHub Desktop had no problem authenticating me.  Problem solved, right? Well, almost.  For some reason, the new GitHub Desktop didn’t pick up any of the local repositories that my old GitHub Desktop knew about.  So, to complete syncing, I’d need to manually add my repository, then I could sync.  Later, I could introduce all my other local repositories into the new Desktop, and then I could just use the new version from now on.

Well, for some reason, the New GitHub and the Old GitHub didn’t seem to agree with each other on the status of my local repositories.  This is where an intricate understanding of how git and GitHub work would have come in very handy, but I didn’t have that.  All I had was Old GitHub telling me that I had a recent commit locally that was pending sync, and New GitHub telling me that it saw no changes, so had nothing to commit, nor anything to sync.
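In hindsight, the disagreement was about commits that existed locally but hadn’t reached the server, and plain git can show exactly that. A sketch, again using a scratch repo, with a local bare repository standing in for github.com (all the paths and names here are illustrative):

```shell
# A bare repo stands in for the GitHub-side repository.
rm -rf /tmp/fake-github /tmp/local-copy
git init -q --bare /tmp/fake-github
git init -q /tmp/local-copy
cd /tmp/local-copy
git config user.email "demo@example.com"
git config user.name "Demo"
git remote add origin /tmp/fake-github

# First commit, pushed: local and "server" agree.
echo one > file.txt
git add -A && git commit -q -m "first"
git push -q -u origin HEAD

# Second commit, NOT pushed: this is the "pending sync" state.
echo two >> file.txt
git add -A && git commit -q -m "second"

# List the commits the server doesn't have yet.
git log --oneline @{u}..HEAD
```

If that list is non-empty, you have local work that exists nowhere but your machine, which is exactly what had me panicking in the middle of the night.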

WTF? I still don’t know. I quickly went through some of my other local repositories, and found that New and Old GitHub disagreed about their check-in/sync status as well.

Fortunately, I eventually discovered that I could simply roll back the update to the old GitHub Desktop, and once I had done so, it was once again able to authenticate me to GitHub’s server, and I was then able to sync up my local files with the server.

I understand that the whole point of git is to have de-centralized version control, but I still think of the repository on GitHub’s server as The Repository.  The files there are backed up, are offsite relative to me, and I feel comfortable knowing that they exist somewhere else in case of disaster.

Now that I have everything synced with the server, I can switch over to using the New GitHub Desktop, and life will be good.  If my other repositories still have disagreements between the Old and New GitHub Desktop, I can resolve them, and it won’t be a problem.

In all, this took me about 3 and a half hours to resolve, from the time that I applied the bad update to Old GitHub, to restoring Old GitHub to the version that works and completing my sync.  Unfortunately, I was up half the night doing this.  I was in a panic, and whenever I have problems with a computer that means I am at risk of losing data, I just can’t sleep until the problem is solved.

So, my takeaway from this experience is that the primary cause of the problem was a failure of the tool, but it was compounded by my stupidly not completing my check-in/sync work before applying an update.  In recent years, software quality has gotten so good across the board that I’ve grown accustomed to upgrades going without a hitch.  I can install Chrome upgrades, relaunch, and it’ll remember all my preferences, all my tabs and sessions, and it’s not even a concern at all anymore. 10, 15 years ago, when I’d upgrade my web browser, it was this big ordeal of bookmarking any open tabs that I wanted to return to, checking all my extensions for compatibility with the new version, installing the upgrade, re-checking all my configuration customizations, etc.  Such a pain.  It’s good to remember how much better things have gotten over the years.  This time, though, it bit me.  If I’d simply completed the work I was doing before applying the upgrade, I could have gone to bed a lot earlier, and worried about the authentication bug later.  So, that’s a good lesson to remember.

Also, as this article shows, I don’t mind being dumb in public after the fact, if once I’ve learned from the mistake, I can share what I’ve learned.  I still don’t know git, or GitHub as well as I’d like, and I’m sure I’ll never have perfect, full understanding of either, but they’re very good tools, and I’m glad I can use them well enough.

InvadTris: A Ludum Dare 41 Game

Over the weekend, I participated in Ludum Dare 41. The theme for this Ludum Dare was “Combine 2 incompatible genres”.  The game I produced, InvadTris, is a mashup of Space Invaders and Tetris, combining the static shooter with a block puzzle game.  I’m very happy with it, and am continuing to develop it. It’s already a lot of fun to play.

Play and Rate InvadTris

Post-mortem article


Site update

I’ve done a little re-design of the menu for the site. My goal was to make older content, and my projects, more visible to visitors. I’m not done with this yet, but the first two menus (Game Development Tutorials, Projects) now have sub-entries.  Some of the entries under Projects leave the site and go directly to the project page on the GameMaker Marketplace, or elsewhere.  This seems to work OK in a desktop browser; however, on mobile or narrow screens the sub-menu entries don’t seem to be visible.  I will have to see what I can do about that. Please report if you have a problem navigating the site.

AtariBox/VCS smells like vapor, poop

Today, UK news source The Register published an article on the new Atari VCS, formerly known as the AtariBox.  I refuse to call it the VCS, because that name is already in use, so I’ll just stick to calling it the AtariBox, to avoid confusion.

http://www.theregister.co.uk/2018/03/22/atari_lempty_box/

I love the URL for the story.  “Atari Lempty Box” has such a nice ring to it. Like a French existentialist “L’empty Box” that smokes cigarettes in Parisian cafes, complaining bitterly about the meaninglessness of life.

Lol. OK. Here’s CNet’s slightly more forgiving coverage.

The Atari fan communities that I follow on Atari Age and Facebook have been roasting this system for months. So much signals that this is going to be a disaster.  The biggest red flag is the lack of any hard information about hardware specs, developers, games, capabilities, etc.  What has been announced is either vague or very uninspiring.

And now this article.  After months of feeble, empty pre-launch hype, and an aborted attempt at a crowdfunded pre-order, “Atari” shows up at GDC 2018 with an inert piece of plastic shaped like their new console, and no new information.  The CNet article at least explains why — according to Atari, they couldn’t agree on the controller, and ended up rethinking the whole project, which is why they canceled their crowdfunding campaign last year, and why they still don’t have a lot to show for themselves yet.  But that’s still not a very good sign.

Putting aside the obvious con job that this is turning out to be, let’s look at why AtariBox is such a bad idea. Let’s take a look at AtariBox’s selling points:

  1. OMG the case! It has real wood grain! An Atari Logo! And lights!

    By far, the biggest selling point Atari has presented is the attractive design of the case.  It looks nice, I’ll give it that.

    But that’s it. It has an Atari logo on it, and real wood grain.  I’m pretty sure the original Atari used fake wood grain. The hardware inside the case is what matters, though, and we still know nothing about that, other than some very vague mention that it’s going to be AMD-based.

    It looks like an original woodgrain Atari 2600 was crossbred with one of those old cable TV channel selector boxes they used to have in the 80s.
    Atari 2600

    +

    80s cable TV box

    =

    AtariBox
    I’ll grant it does look nice.  But, I don’t really care that it looks nice.  When I play a game console, I’m looking at the screen, not the case. I play the console for the games it can run, not for its brand. A game company creates a good brand by consistently creating great games.

    Focus on the games. AtariBox has revealed almost nothing about the games it will run, after over a year of hyping the new console. That’s troubling.

    We’re teased that they’re talking to developers about creating new games based on classic Atari IP.  We’re told that AtariBox will run hundreds of “old games”.  We’re told it will run “new games” too.  We’re told it will cost ~$300, so we don’t expect it to be capable of running cutting edge games, at least not at high framerates with all the bells and whistles.

  2. It runs Linux!

    Nothing against Linux, I love open source software. It’s a good choice. But so what? In 2018, anything can run Linux. It’s not a big deal.

    The real selling point of a game console isn’t the OS, it’s the Games.

    IT’S THE GAMES, STUPID.

    A nice Atari-themed desktop environment would be cool, but inherently, whatever they build to run on Linux could run on any other hardware with a build of Linux compiled for it. And thanks to the GPL, Atari is required to make the source code for the GPL-licensed components of its Linux build available to its customers.

    Like, I could take a commodity AMD PC, slap AtariBox’s Linux distro on it, and then I could run the same software on it.

    But perhaps they’ll keep their applications that run on top of the Linux layer proprietary. (Of course they will, who am I kidding?)

    In that case, what do I care that they made use of some open source stuff? As an open source proponent, I like when open source propagates and begets more open source. Open source being leveraged as a platform from which unfree software is sold isn’t exciting if you’re attracted to the openness aspect of the system.

  3. It streams video as well as plays video games!

    Yeah? So does my TV. So does my phone. So does my car. So does everything.

    This is 2018. Streaming video over the internet is not amazing anymore; it’s basic. It’s like how every home appliance in the 1980s had to have a digital clock: nobody cared, since they already had a dozen appliances with clocks built in, but leaving one out would have seemed weird, because everything had one.

    But do you need to buy another thing that streams video?

    No, you don’t.

  4. It plays old games AND new games!

    Old games:

    I like old games. I’m glad new devices can play old games. If you didn’t have that, old games would die off. So I’m glad there are new devices that can play old games.

    But here’s the thing: This is another solved problem. We have Stella. In fact, it’s pretty much guaranteed that the old games that you can play on an AtariBox will be played through Stella. After all, why would they bother to develop a competing system to run Atari games, when Stella is stable, mature, open source, and amazing?

    It’s remotely conceivable that, rather than emulating the Atari 2600 in software, they could include an FPGA implementation of the Atari 2600 hardware, which would be pretty cool, since it would be that much closer to the original hardware, and could perhaps do things that Stella can’t. But I can’t think of anything Stella can’t do. I’m sure Stella isn’t 100% perfect, because nothing is, but I have been using it since at least 1996 (22 years now), it was pretty damn good even back then, and I couldn’t name anything I wish it did that it doesn’t already do as well as I want. Granted, I’m not a hardcore user who deeply groks the hardware it emulates and can discern imperceptible differences between original hardware and emulator. It’s possible there’s something Stella can’t do, or can’t do well, that would make an FPGA Atari worth it.

    But it’s probably useless to speculate about it, because it’s all but a given that the AtariBox isn’t going to be an FPGA system.

    Even if it were, AtariBox almost certainly won’t be selling you every ROM ever released. No single entity, not even Atari, owns the IP rights to the entire Atari 2600 library. At best, they’ll be offering a good chunk of the total library. And granted, out of the 700+ titles developed for the Atari 2600, a huge proportion are not good enough for anyone to miss them. Still, the entire library is under a megabyte, so what the hell, you might as well include everything.

    But this is where “abandonware” (software “piracy” of “dead” systems) shines.

    (Of course, Atari never really died if people never stopped playing it, did it?)

    But it did exit the market, and that’s what I mean by a “dead system”. Even notwithstanding a brilliant homebrew community continuing to publish new titles for the system, I still think it’s reasonable to consider the Atari 2600 dead, and not just dead, but long dead.

    Abandonware proponents say that if our copyright laws were just, old, obsolete games would have been ceded to the public domain once they were no longer viable to sell in the mass retail market and sustain a company.

    Of course, legally, that never happened.

    And so, year after year, we see various attempts at re-incarnating Atari’s classic library of games. This never really stopped happening.  NES killed Atari, but many classic titles of the Atari era have NES ports.  And SNES.  And anthology collections on every generation of game console since then, until now.

    See what I’m getting at?  Why do we need an AtariBox to “bring back” the classics, when this stuff has never gone away?

    But the thing is, these commercial repackagings that we get re-sold again and again are always inferior to what you can get if you aren’t encumbered by intellectual property laws and can treat 30-40 year old software as having entered into the public domain.  Go to a ROM site, download 700+ Atari 2600 ROMs in one click, unzip, launch Stella.  You’re good to go.

    New games:

    I like new games, too!  But there’s no shortage of platforms to play them on already!  What does AtariBox offer that’s new or different from XBox, Playstation, Switch, PC, Android, iOS?  What could it offer? The company calling itself “Atari” doesn’t have the deep pockets of Microsoft, Sony, Nintendo, Apple, Google.

    Exclusives? Nobody wants to be exclusive on the smallest upstart competitor’s box.  Successful games that people want to play are generally ported to as many platforms as possible.

    Nintendo doesn’t port their first-party titles to Xbox and PlayStation, but that’s because they’re very well established, and well-heeled, and they can afford to.  That’s what it’s like when you never went bankrupt.

    Atari has some very iconic, classic IP, which they could conceivably bring back, but it’s not nearly as attractive as Nintendo’s A-list. Tempest 4000 looks pretty cool, but Tempest is not Mario, Zelda, or Metroid caliber, not even close.

    Various incarnations of Atari already have re-packaged and licensed that IP to anyone and everyone over the last 20+ years.  They could try to create some brand new titles inspired by their old IP, and keep it reserved as exclusive content to help sell their platform. This is probably what I would be most interested in.  Not playing “new games” from a couple years ago, like Skyrim or Hotline Miami on AtariBox, but playing an all-new Pitfall! that looks and feels like the Atari 2600 game, and just has some more to it. Give it to the guy who did Spelunky, maybe.  Let him see what he can do with it. Or maybe bring back David Crane if you can get him, and see what he can come up with now.

    But the thing is, if those games are any good, they would sell far better, wider, and more copies if they were made available on every platform. We learned this a few years ago from Ouya. Ouya courted indie developers, but indies released anywhere and everywhere they could, and in the end no one gave a shit about Ouya.

    The AtariBox hardware is all but certain to be less powerful than the XBox One, PS4, or even the Switch.  So it’s not going to play cutting edge new games, but will play “new-ish” games from 2-5 years ago that we’ve already seen and played through.  Why would we want to buy them again, just to play them on a box with a Fuji logo on it?

As much as I would love for there to be a viable Atari console in 2018, I just don’t see what possible niche they could occupy that would work for them well enough to enable the company to compete in today’s market.

Visual theory of the Z3D Engine

Editor’s note: [I originally wrote this as an Appendix to the documentation for Z3D Engine, but I think it’s interesting enough to deserve a slightly wider audience.]

I have to preface this section by saying that I have no idea what I’m talking about here, but am trying to learn.  I like math, but I didn’t go to school primarily for it, and that was decades ago. I haven’t studied 3D geometry, or optics, or computer graphics in any formal sense. I’m figuring this out more or less by myself, learning as I teach myself.

So if someone who knows more than I do wants to explain this stuff better than I can, I’d love to hear from you. You can send me an email at the Contact page, or tweet at me @csanyk, or just comment on this article.

Thanks in advance!

I’ve called Z3D Engine a “Fake 3D” engine and “2.5D” engine, because those are fairly vague terms that I don’t have to worry about being right about. Someone asked me what type of view it is, and I couldn’t tell them. That bothered me, so I started reading a bit. I still don’t really know for sure.

or·tho·graph·ic pro·jec·tion
ˌôrTHəˈɡrafik prəˈjekSHən/
noun

  1. a method of projection in which an object is depicted or a surface mapped using parallel lines to project its shape onto a plane.
    • a drawing or map made using orthographic projection.

I think this is sort of close to what Z3D is… maybe.  What I can tell you about Z3D is this:  You can see the full front side and the full top side of (most) objects.  These do not foreshorten.

fore·short·en
fôrˈSHôrtn,fərˈSHôrtn/
verb
gerund or present participle: foreshortening

  1. portray or show (an object or view) as closer than it is or as having less depth or distance, as an effect of perspective or the angle of vision.
    • “seen from the road, the mountain is greatly foreshortened”

The blue rectangle that represents the “player” in the demo is intended to show the player as a side view only, with no pixels in the sprite representing the top surface of the player. This is because I’m intending Z3D to be used for games drawn in a visual style similar to the top-down Legend of Zelda games, and in those games, no matter which way Link is facing, you can only see pixels in his sprite that represent his side, and nothing that represents the top of him, even though you’re viewing most of the rest of the terrain in the room from this weird view where you can see both the top and side of things like blocks and chests, and for other things like bushes you can only see the side.

Things in Z3D do not appear to get smaller as they recede into the background, or bigger as they approach the foreground.  As well, the tops of objects (that have tops) are drawn 1 visual pixel “deep” (in the y-dimension) for every pixel of distance.

This doesn’t look correct, strictly speaking; if you’re looking for “correct” visuals this engine likely isn’t for you.  But it is visually easy to understand for the player, and it is very simple.

What I’m doing in Z3D Engine is showing the top of everything (that has a top) as though you’re looking at its top from a vantage point that is exactly perpendicular to the top, while at the same time you’re also seeing the side of everything as though you’re looking at the side from a vantage point that is exactly perpendicular to the side.  This is an impossible perspective in real life, but it works in 2D graphics that are trying to create a sort of “fake” 3D look, which is what Z3D does.

Imagine you’re looking at this cube:

Cube

At most, assuming the cube is opaque, you can see only three faces of the cube from any given vantage point outside the cube; the other three faces are occluded on the other side of the cube.

Cube with occluded faces

(The image above is properly called isometric, by the way; Z3D is not isometric.)

If you were looking at the cube from a vantage point where you were perpendicular to one of the faces, you could only see that one face, and it would look like a square:

Square

(Since the faces of this cube are all nondescript, we can’t tell if we’re looking at the side or the top of the cube.)

Now, if it were possible to be at a vantage point that is exactly perpendicular to both the side and the top of the cube simultaneously, the cube would look like this:

Flattened Bi-perspective cube

This is weird and wrong, and yet it is easy to understand, and it turns out that it is also very easy to compute position and movement along three dimensional axes if you allow this wrong way of drawing. This view is (or perhaps is similar to) a method of visualization known as oblique projection.

More properly, if you were positioned at a vantage point somewhere between the two points that are perfectly perpendicular to the top and side faces, the cube would look like this:

Cube in perspective

Here, obviously, we are looking at the cube mostly from the side, but our eye is slightly above, so we can see the top of the cube as well.  But notice, since we are not viewing the top face of the cube from a perpendicular vantage point, it does not appear to be a square any longer — it foreshortens, so that the far end of the top of the cube appears narrower than the closer end.

This is perhaps obvious, because we’re used to seeing it; we see it every day, because that’s what real life looks like.  But it’s because we see this every day that we take it for granted, and when we have to explicitly understand what’s going on visually with geometry, we have to unpack a lot of assumptions and intuitions that we don’t normally think about consciously.

If we were to put our eye at the exact middle point between the points that are perpendicular to the side face and the top face, the cube should look to us like this:

Cube at 45°

Notice that both the bottom of the side face and the far edge of the top face are foreshortened due to perspective.

This is how they “should” look in a “correct” 3D graphics system, but Z3D “cheats” to show both the side and top faces without doing any foreshortening, which means that it can draw an instance as it moves through any of the three dimensions using extremely simple math.

Visually moving 1 pixel left or right is always done at a hspeed of -1 or 1, regardless of whether the object is near (at a high y position) or far away (at a low y position).  Likewise, moving near or far is also always done at a rate of one distance pixel per apparent visual pixel. And moving up and down in the z-dimension is also always done at a rate of 1 distance pixel per apparent visual pixel.
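Since the engine itself is GML, here’s the same idea as a tiny Python sketch (the function name and coordinate conventions are mine, for illustration): the screen position of an instance at (x, y, z) is simply x across and y minus z down, with no division by depth anywhere.

```python
def z3d_project(x, y, z):
    """Project Z3D-style world coordinates to screen pixels.

    x: left/right, y: depth (high y = near, low y = far), z: altitude.
    There is no perspective divide, so 1 unit of movement along any
    axis always produces exactly 1 apparent pixel of movement.
    """
    return (x, y - z)  # altitude simply lifts the sprite up the screen

# Moving 1 unit along each axis, one at a time:
assert z3d_project(10, 50, 0) == (10, 50)
assert z3d_project(11, 50, 0) == (11, 50)  # 1 right: 1 pixel right
assert z3d_project(10, 49, 0) == (10, 49)  # 1 farther: 1 pixel up
assert z3d_project(10, 50, 1) == (10, 49)  # 1 higher: also 1 pixel up
```

Note the built-in ambiguity of the trick: rising 1 pixel in z looks identical to moving 1 pixel away in y, which is why shadows or other depth cues usually accompany this style.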

If we wanted to draw more convincingly realistic 3D graphics, we would need to understand what’s going on with the eye, with perspective, and with things at a distance.

Eye viewing the cube at a 45° angle

The same object viewed in Z3D’s perspective is something like this:

Eye looking at Z3D rendering

(We’ve omitted the occluded faces on the back end of the cube relative to the viewer, for simplicity.)

These two “apparent” perspectives are combined at the point where the player’s real eye is, resulting in something like this fake-3D perspective:

Z3D Flattened Orthographic bi-perspective rendering

So, in conclusion, I’m not 100% sure that my terminology is correct, but I think we can call this perspective “flattened orthographic bi-perspective”, or perhaps “oblique projection”.

(From this, we can begin to see how a corrected view might be possible, using trigonometry to calculate the amount of foreshortening/skew a given position in the Z3D space would need in order to appear correct for a single-POV perspective.  But this is something well beyond what I am planning to do with the engine; if you wanted this, you would be far better off creating your game with a real 3D engine.)
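To make that parenthetical slightly more concrete: the correction would amount to a perspective divide, scaling apparent size by d/(d+y) for an eye sitting a distance d in front of the near plane. A hypothetical Python sketch, not anything Z3D actually does:

```python
def apparent_scale(depth, eye_distance=256.0):
    """Hypothetical foreshortening factor for a single-POV perspective.

    depth: how far past the near plane the feature sits.
    eye_distance: how far the eye is from the near plane (made-up default).
    Z3D skips this entirely and always draws at a scale of 1.
    """
    return eye_distance / (eye_distance + depth)

assert apparent_scale(0) == 1.0        # at the near plane: full size
assert apparent_scale(256.0) == 0.5    # one eye-distance away: half size
```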

It gets weirder when you realize that for certain objects, such as the player, we’re going to draw only the side view, meaning that the player will be drawn a flat 2D representation in a fake 3D space.  Yet the player’s “footprint” collision box will likely have some y-dimension height to it.

AtariBox rebrands itself to “Atari VCS” in an apparent attempt to sow confusion^H^H^H^H^H^H^H^H^H^H^H^H^H reboot brand

Ataribox is now Atari VCS, preorder date revealed soon

In a move that endears me to the new gaming console not the slightest bit, Atari has announced that they are re-naming their upcoming AtariBox console to the already-taken name, “Atari VCS”.  Henceforth, people who want to search for the 1977 Atari VCS, later renamed the Atari 2600, will have to wade through hits for the modern AtariBox-Atari VCS that will be released sometime in 2018 (maybe). And vice versa.

That won’t be completely annoying to fans of either console.

“Null Room” hidden in Superman (Atari, 1979)

Atari gamer Marc Gallo has found a secret hidden Null Room in the game Superman (Atari, 1979). Accessed via direct manipulation of memory addresses in emulation, the room does not appear to be accessible through normal gameplay.

I believe this “room” is really just a memory location intended to store objects when they are off-screen, which can be displayed as a “room” in the game, but isn’t meant to be.

It’s interesting to me since I spent considerable time playing this game, and wrote an article some time ago about the central role that the map and movement play in the design of the game.

Why I’m “meh” on the SpaceX Falcon launch

This image could have been produced by MTV 35 years ago, for a lot cheaper.

I’m very meh on the success of the SpaceX Falcon launch yesterday.

I mean, I guess it’s good that someone still gives a shit about escaping Earth orbit enough to actually do it. It’s a wonder more of us don’t, considering how things are here. We need to be more ambitious about space.

And it’s nice to know that we still have the capability to launch a decent rocket. The success of the Falcon Heavy lift vehicle test is a good thing. And I guess since it was just a test launch, maybe that’s why they chose to lift a Tesla Roadster instead of putting billions of dollars of serious R&D at risk on something actually useful.

But it’s incredibly sad to me that it’s a billionaire taking a joyride, publicity stunt putting one of his car company’s cars up there, instead of science gear, or habitation infrastructure, or something industrial. The money that put that car up there could have saved thousands of our frankly worthless lives were it spent on the right stuff.

I said as much when Guy Laliberte from Cirque du Soleil went up to squirt water around his capsule and play with it in zero G, too, and everyone thought I was harsh and wrong.

Well, you’re wrong. Shit down here is serious and it’s seriously broken.

Time for billionaires to fix some of it, since they’re responsible for so much of it.

z3d Engine for GameMaker Studio

z3d is a fake-3d engine designed for simplicity, efficiency, performance, and ease of use. Full documentation + demo included.

In a 2D GameMaker room, x and y coordinates are used for positions in the 2D space. 3D requires a third variable for the third dimension, z. In the z3d engine, x and y are used to represent the “floor” plane as viewed from above, with a forced perspective that gives the viewer a full view of both one side and the top of objects, while z is used for altitude.
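One payoff of treating z as plain altitude is that jumping and gravity reduce to one-dimensional motion along the z axis. Here’s a Python sketch of the idea (the variable names are illustrative; they are not the engine’s actual API):

```python
def step_jump(z, zspeed, gravity=0.5):
    """One step of vertical motion for a jumping instance.

    z is altitude above the floor plane; zspeed is vertical velocity.
    Returns the updated (z, zspeed), clamping to the floor at z = 0.
    """
    z += zspeed
    zspeed -= gravity
    if z <= 0:          # landed
        z, zspeed = 0, 0
    return z, zspeed

# A short hop: launch upward, rise, fall, land back on the floor.
z, zs = 0, 3
for _ in range(20):
    z, zs = step_jump(z, zs)
assert (z, zs) == (0, 0)  # back at rest on the floor
```

The instance’s x and y are untouched while it jumps; only where the sprite is drawn on screen changes, which is the whole trick of the fake-3D look.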

GameMaker Marketplace

Full Documentation

csanyk.com © 2016