csanyk.com

video games, programming, the internet, and stuff

AdapDesk kickstarter melts down

Last April, I backed a kickstarter for AdapDesk, a portable lap desk ideal for use with a laptop computer in bed or seated in a chair without a table. It was pricey, at $125, but looked like it was so well designed that it would be worth the money to have a quality lap desk.

AdapDesk

The kickstarter was successfully funded and my payment went through on May 13. The fundraising part of the kickstarter was very successful, raising several hundred thousand dollars against an original goal of $15,000. A short while later, I was sent a survey asking if I would like to order any extra accessories, and so I sent them another $26 for a cup holder and mouse pad attachment.

Delivery was originally supposed to happen, I believe, in September. This slipped to December, then to mid-January.

Two days ago, on 1/15/18, the AdapDesk team made an announcement. I was expecting to hear that they had shipped, or that they had to delay yet again. Instead, they announced that they have run out of money, and need more in order to complete their obligations to backers. They are asking for $55 per desk for air shipping to the USA, which is 44% over the original cost (37% if you count the original shipping).

Overruns and lateness are very common with kickstarter projects. I don’t have statistics, but you hear about them enough to know that they happen. And certainly there is always greater risk when you back a project rather than purchase a product. I can tolerate lateness — better to have a product that is late but correct and good than to have something that has problems but is on time. And I have been fortunate enough not to be burned very often by kickstarters that fail to deliver entirely. Although, certainly, that’s part of the risk of backing a project.

This is a bit of a different situation. The project team wants to complete their work and deliver, but they say they need more money to accomplish this. Giving them money is throwing good money after bad. There's no way that they can guarantee delivery even with the extra money; maybe they can pull it off, maybe not, but in spite of their assurances it's not guaranteed.

Backers are upset about not getting their reward, and about being asked to kick in even more money to (maybe) get what they paid for. Understandably, and justifiably, upset.

Some have been more understanding and are actually willing to put in the additional money. Others are upset, but still want their AdapDesk and will put the money in if they must. A few are disabled/bedridden and don’t have more money, but really needed their desk. But most of us are angry and want either our desk at no additional cost, or our money back.

Both are unlikely: if the AdapDesk team is out of money, they can't complete the project, and they have no money left to refund dissatisfied backers.

So there’s talk about legal obligations and criminalizing the project’s failure by calling it fraud.

Bad business isn’t necessarily fraud. Sometimes things don’t go well and a business fails, declares bankruptcy, and people don’t get what they’re owed. That’s life.

It’s interesting to see how different backers characterize their relationship to the project.

Some backers consider themselves investors. This is false. Backers do not own shares of the company, or of the project.

Most feel that they purchased a product. Even this is somewhat debatable. Backers were promised a reward for backing the project; in this case, the “reward” is the very product the project exists to create. Thus, the “reward for backing the project” closely resembles a pre-order placed before the product is produced. Arguably, it is a pre-order. But technically, backers contributed money to fund a project to produce products, and their reward for backing it was to receive one of the products produced.

Whether AdapDesk failed to fulfill orders, or failed to reward backers of a project, the results are the same, and the AdapDesk team has failed.

When a project fails due to cost overruns or other reasons, backers lose out, much like investors in a failed business lose money on a bad investment. This is a risk of crowdfunding. The AdapDesk team has offered to complete the project if they receive additional funds, but there’s no way of telling for sure that they will be able to do so.

There’s probably not much recourse at this point for backers who are unwilling to contribute further funds and just want either their reward or their money back. Credit card charge disputes may be the only way to recover money, but whether those will succeed or not remains to be seen.

Update: AdapDesk’s page on Facebook has been taken down. AdapDesk also ran a crowdfunding campaign on IndieGoGo, which has not been updated with information consistent with the messaging on Kickstarter, and people are beginning to ask questions. There’s a product listing on Amazon that looks like it has delivered, as there are reviews; of course, reviews could be faked. Their web site appears to be just a rotating image gallery with a link to the kickstarter page. It’s still hard to say, but the inconsistent information between Kickstarter and IndieGoGo is suspicious, and the lack of information on their homepage, combined with their Facebook page being offline, isn’t reassuring either.

Update 2: As of 1/21, AdapDesk has updated its IndieGoGo campaign with the same information and the same request for additional money from its backers. There is now a Facebook group for AdapDesk customers to connect with each other and discuss any developments arising out of this, including possible legal action.

Update 3: Backers who paid the additional money to AdapDesk are starting to report that they have received their orders. In at least one case that I’m aware of, a backer only received a partial order: 1 out of 2 desks, and no storage bag. It’s good to see that AdapDesk is fulfilling orders, as a significant number of backers feared that they would lose their money if they paid the additional amount. While it’s obviously an unfortunate disappointment that the project delays and overruns resulted in so many dissatisfied backers, at least we know now that the project wasn’t a scam.

Update 4: Today some kickstarter backers who did not pay the additional money to AdapDesk have reported receiving their desks! It remains to be seen whether everyone will get theirs; I have not yet received mine. I’m genuinely happy for anyone who receives their reward for backing the project, but I can only imagine how annoyed the backers who did pay the additional money must feel, seeing those who didn’t pay up get theirs for the original pledge.

Update 5: On 2/20/2018, I received my AdapDesk backer reward. This was unexpected as I did not pay the additional funding that AdapDesk said they needed in order to complete the project. AdapDesk promised to deliver to all backers who paid them the additional money, but never said that they would deliver to backers who did not.

Obviously this was to incentivize the additional round of funding, because who would have paid more if they didn’t have to? But it was strongly implied that they couldn’t complete the project without the additional money they were asking for. I don’t know how they were able to ship my desk without me giving them the additional money, but I’m happy that I received what I was originally promised for the funds I contributed. Many backers are still waiting on theirs, including those who paid the additional money.

Reviewing the AdapDesk

Overall, I’m happy to have received something, and am not one to dwell on the poor communications and delays, although they did sour the experience.

I was expecting to receive an AdapDesk Standard, but received an AdapDesk Fully Foldable instead (it has a hinge in the center of the desktop that allows the desk to be folded small enough to fit into a backpack).

I didn’t need or want this, as I don’t plan on travelling with my desk, and wanted it for use around the house.  The center hinge is locked into place by two rotating knobs. I was concerned that these knobs would bump into my legs when using the desk, but the desk is tall enough that this doesn’t seem to be an issue. However, the legs are not able to fold up when the knobs are in the locked position — they hit the knobs, preventing the legs from fully folding with the desktop locked open.

In use, the desk is well designed and functional. The biggest disadvantage is that the cup holder doesn’t have a bottom. It’s just a hole. If you use a cup with a wide mouth and a narrow enough base, it will sit in the hole and not fall through. If you use the desk on a surface like a floor or table, the cup can rest on that surface and be held upright by the cup holder. But the cup holder would be a lot better if it had a bottom that could actually hold a container such as a soda can.

When breaking down the desk to put it away, you have to remove the accessory trays from their mounting points on the desk’s legs, and remove the plastic inserts for the slots on the desktop that are intended for holding pencils, pens, or small thin devices like a phone or calculator.  You also have to remove the center lip that attaches to the desk to prevent things from rolling off it when it is angled. (I wish there was a better name for this, I’ll have to refer to the instruction guide later and see what it’s called.)  So folding it up to put away takes a lot of steps and you can’t simply fold up the legs and leave the rest of it as-is.  It would be a lot nicer if you could do this.

Since I had given up on receiving my desk, I had also gone out a couple of weeks prior and bought a different desk, which I think is nearly as nice. It only cost around $40. It’s not perfect, either, but compared to the AdapDesk at $125, it’s certainly a better value, and of equal quality.

 

How are my kickstarters doing?

I thought it was about time I took a look back at the various kickstarter projects I’ve backed, and see how they’re doing. Over the last few years, I’ve heard so many negative stories about failed crowdfunding projects, tales of fraud and angry and disappointed backers, that I’d come to feel somewhat negatively about crowdfunding. But really, I think the projects I’ve chosen to back have done pretty well. Not all of them have been successes, but the rate of failure is less than my emotional “feel” for the rate of failure led me to believe. And of the successful projects, quite a number of them have ongoing life beyond shipping the backer rewards. I feel good about this.

Here then is a list of every Kickstarter I’ve backed, and what happened with it.

Chip Maestro – An NES MIDI Synthesizer Cartridge – Delivered

This was the first project I ever backed on Kickstarter. It took much longer than expected to deliver. I was not surprised by this, and it didn’t bother me. I just waited patiently, and the developer came through. It’s really cool to have MIDI output capability to enable using the NES as a musical instrument.

The Jason Scott Documentary Three Pack – Still in process

This is the oldest kickstarter that I’m still waiting on, but it’s hardly surprising. Producing a documentary film takes a lot of time. Jason Scott works very hard on many different projects. Last I heard, he had to drop the Tape documentary for lack of content, but was working on editing as of last June. Since then, Jason has had a heart attack, and is currently producing a weekly podcast in an effort to pay down some financial debt, which I am a backer of. I’m confident the documentaries will be finished and released. From my experience, Jason is very scrupulous and hard working, and wants to release a first-rate effort, so I’m being patient and looking forward to viewing them when they are ready.

Code Hero: A Game that Teaches You To Make Games – Failed

This project ran out of money and went bust. Oh well. $13.37 well spent anyway.

Spriter – Delivered

I backed Spriter hoping that it would reach its stretch goal to fund development of GameMaker integration. I’m not sure whether that integration ever materialized, because I haven’t used Spriter; GameMaker ended up adopting a similar technology, called Spine, for sprite rigging. To date, I still haven’t explored this feature, because my projects tend to be smaller and simpler than what would call for Spine or Spriter, and I tend to focus more on programming than on graphic assets. But I’m glad Spriter exists, and I’m glad that I funded it. Even after the Kickstarter rewards were delivered, it is still being developed.

Light Table – Delivered

Light Table was a fantastic idea for an IDE: Give the programmer immediate results, shrinking the feedback loop to zero, enabling instant iteration, and a more intuitive experience for programming stuff. I love the idea of seeing your code instantly interpreted and running, and not having to compile and wait. Light Table was completed, released, and is still being developed and supported.

Atari 2600 Star Castle – Delivered

This project was executed particularly well, and my copy of Star Castle was delivered within a reasonable amount of time. I don’t think it was strictly speaking on time, but it wasn’t long overdue, either, and the project communicated status updates in a timely fashion that helped to manage expectations.

Beautiful Vim Cheat-Sheet Poster – Delivered

Max is a friend of mine, and his little project exceeded his goal considerably. He did a nice job on the poster, and I really like it.

Tropes vs. Women in Video Games – Delivered

Anita Sarkeesian has been a major influencer since launching this kickstarter. The reaction against her project is infamous, and has helped to drive home the point that her work is very much needed. I’m proud to have contributed. Her video series Tropes vs Women in Videogames took a long time to produce, but was very well done. Its aim, to bring her Tropes vs. Women series examining various anti-women tropes in popular culture (movies, TV, etc.) to videogames, was and still is much needed.

OUYA: A New Kind of Video Game Console – Delivered

The OUYA is now a dead system, but the project was a success. I received my OUYA and played with it. It was a TV-connected, Android-based console, about the size of a baseball, and could play a lot of games. A lot of people used their OUYA as an emulator box, but there were a few good titles developed specifically for it, most notably Towerfall. The thing is, it was under-powered compared to everything else out there, and since most games are developed and launched for any and all consoles their developers can reach, there was no exclusive “killer app” that could compel gamers to buy one. And a lot of people who did buy one complained that the OUYA’s gamepads felt cheaply built, and groused about every little thing, the way gamers do. I’m sad it didn’t survive in the market. I really liked the idea of an open console that is friendly to indie developers. Unfortunately the business model wasn’t successful, and the market didn’t appreciate it at all. I still consider it a success: despite the fact that it couldn’t survive in today’s market, merely making it to market was an incredible accomplishment.

NeuroDreamer sleep mask – Delivered

My reward was shipped and received quickly. I didn’t pre-order the NeuroDreamer mask, but got a copy of Mitch Altman’s trip glasses, which I’ve used a few times. They work by using flashing LED lights and audio tones to induce an altered brain state, akin to meditation, or perhaps as a meditation aid.

SPORTSFRIENDS featuring Johann Sebastian Joust – Delivered

This project took a very long time to deliver, but I did finally get a copy of my Sportsfriends games. The one I most liked, BaraBariBall, was fantastic. I haven’t played the others.

Aaron Swartz Documentary – The Internet’s Own Boy – Delivered

This documentary is fantastic, and I’m proud to have backed it and to have my name in the credits as a backer. Well worth every penny and then some.

Project Maiden – a Zeldalike in Reverse – Delivered

I only backed $1, so I didn’t get any reward, but I understand this project was finally delivered, taking quite a bit longer than expected. With creative projects like video games and movies, I am pretty lenient on release dates; I get that doing it right takes time and should not be rushed. I have never actually played this game, though, so I have no comment on how good it is.

imitone: Mind to Melody – Delivered

Soon after making goal, I received a license key and access to the software beta. It works, and has been updated frequently. I haven’t used it recently, but it is neat software and still being developed.

The Stupendous Splendiferous ButterUp – Delivered

This shows how serious I am about bagels: I spent I-don’t-want-to-remember how much money on some butter knives that were supposed to make spreading cold butter on toast easier. In practice, I find that they don’t work, and they were basically a waste of money. They are well made, but the design just doesn’t work well. Cold butter does not press through the holes the way their video shows. Live and learn.

Beep: A Documentary History of Video Game Music and Sound – Delivered

I received a DVD copy of the documentary, watched it, and enjoyed it. I thought it was well done.

GameMaker Language: An In-Depth Guide – Delivered

I got a copy of Heartbeast’s book. The project was completed within a reasonable amount of time, and he did a great job with it. He also produces tutorial videos on YouTube, and has branched into teaching online courses through Udemy.

Joybubbles: The Documentary Film – MIA? In post-production?

I backed this at a level that got my name in the credits of the film. The documentary is currently in post-production, according to the website. However, the kickstarter page hasn’t been updated since 2015, so this one appears to be missing-in-action. I’ve written to the creator to ask what the status of the project is.

Insert Coin: Inside Midway’s ’90s Revolution – In progress

The latest update was posted in mid-December; they are still working on the project and are targeting early 2018 for delivery.

AdapDesk: The World’s First Portable Work Station – Late, and at risk of failure

Expected for November, they are a few months late on this one, but were supposedly finally shipping this month.

I can appreciate that mass production isn’t easy. In November, they said that they intended to ship by late December, in December they announced a further delay would push delivery back to mid-January.

It’s January 15, and today they’ve posted a new update on the kickstarter to the effect that they are struggling and nearly out of money. Cost overruns have forced them to ask for more money in order to be able to ship the goods, to the tune of $55+ per customer, depending on where in the world they are. This represents a cost overrun of close to 150% over what they estimated for the project, and I don’t think I would have backed if I had known it was going to cost $55 more than the pitch. It was already a very pricey item at $125, but since it appeared to be very well designed and was something I could definitely get a lot of use out of, I thought it was worth it.

Since this is a developing matter as I type this, I’m not at all clear whether I’m going to get my AdapDesk, or a refund, or screwed, and who’s going to fund that additional $55.

In retrospect, it’s pretty clear that manufacturing small runs of a product is very risky and prone to delays and overruns, so backing kickstarter projects like this is obviously a gamble. If they had brought the AdapDesk to market in a more traditional way, and I could have bought one from a store once they were actually manufactured, I think I would have been happier.

Doing things the kickstarter way is more appropriate for raising funds to prototype a new product. But maybe for experimental products the reward shouldn’t be the actual product. You don’t know whether the prototype will turn out to be any good. Maybe it will be great but infeasible to mass produce at a price point you can predict at the pre-funded stage, when you’re not even sure how many backers (and therefore orders) you’ll have. Or maybe it will suck and not be something worth making more than one of. Maybe the reward should be something else: stock in the company that designed the product, or a t-shirt or sticker that thanks you for your contribution to making the project possible, that sort of thing.

Using Kickstarter to try to create a product that doesn’t exist yet and take pre-orders for it, with the kickstarter “reward” as the means of delivering on an order, doesn’t work out well. If you’re very experienced and good at design, manufacturing, and logistics, then sure, maybe you can do it. But if you’re good at all those things, then you probably didn’t need crowdfunding to begin with, and could have used traditional venture capital, business loans, credit, or what have you instead. And if you’re not experienced at those things, chances are good you’re not going to be able to get the credit, loans, or VC money, and it turns out there’s a reason for that: investors are smart, and know not to throw money at an unproven, risky undertaking by someone without enough of a track record.

In commerce, getting what you paid for isn’t a “reward”, it’s expected.

Kickstarters often fail to deliver what is expected after successfully making their fundraising goal.

Kickstarters are a way to fund dreams that no one in their right mind would get behind as a business investment opportunity, and crowdfunding works because $20 or $50 isn’t all that much to some people. There are good ideas out there that can be funded by large numbers of people each with a tiny amount of disposable cash that they can just throw away. We understand, well most of us do, that we’re not buying success, we’re buying a chance at success, and that chance is less than 100%.

Since that’s the case, maybe the better way to thank backers is through rewards that aren’t predicated on the success of the project, but on the success of the fundraising. Kickstart a rocket to Mars. Make the reward be a “I backed the rocket to mars” sticker, not a ticket on the Mars rocket with a launch date printed on it.

AdapDesk is a great idea for a product. It turns out that bringing a product to market takes more than a good idea, some money, and a lot of work. It takes a good idea, some money, a lot of work, and then a lot more work, and then some more money. We’re at the point where they need that last bit of “some more money” and they’re out, and their customers are pissed. I hope I still get my AdapDesk, but I hope I don’t have to pay $55 to get it delivered on top of the money I already paid. I certainly won’t give them another penny, let alone $55, without an actual tracking number — and maybe not even then.

Make Professional 2D Games: Godot Engine Online Course – Delivered

I’ve watched some of the videos, and they are well done. I have yet to truly immerse myself in Godot engine, but I am very happy to support an open source 2D game engine of high quality.

Next Gen N64 Controller – In Process, Late

This project from RetroFighters should be shipping soon. Early word is that the controller is very good. Originally these were supposed to be delivered in late 2017, but a month or two delay is forgivable. For $20, a newly designed gamepad for the Nintendo 64 built to high quality standards is very impressive, if that is indeed what they deliver.

Full Quiet – A New Adventure Game for the NES & PC – In Process

Expected delivery date in late 2018, but we know how this goes… waiting and seeing.

NESmaker – Make NES Games. No coding required – Backed

Kickstarter is still in the funding stage. They’ve already hit their goal, so it will be interesting to see how far it goes and how many of their stretch goals they can reach.

NESmaker kickstarter promises every 80’s kid’s dream

NESmaker is a no-coding IDE for creating games for the Nintendo Entertainment System, currently being kickstarted by The New 8-Bit Heroes‘ Joe Granato. When they say you can make a NES game with this toolkit, they mean real NES games that you can play on actual hardware. This is pretty amazing.

The story behind it is that some NES homebrewers are turning the tools they’ve developed for their own use into a product for anybody to use.

Normally, if you want to program for the NES, you need to learn 6502 assembly and get really “close to the metal”, which is not for everyone. With NESmaker, supposedly you won’t need to code at all, although you’ll be limited to creating “adventure games” (think top-down Zelda-likes). They are hoping to raise enough money to create additional modules that will enable users to make games in other genres.

Although the developers have been using the tool internally on their own projects for a few years, it needs more polish before it’s ready for general use, so they are running a kickstarter right now to take pre-orders and to raise the necessary funds to complete their project. This includes not only the NESmaker software, but the hardware needed to flash a game pak so you can put your finished game on a cartridge and play it on real hardware.

How cool is that?

A quick and dirty, incomplete and opinionated history of videogames and related technologies

Pre-1977

  • The Dawn of Time.
  • Pinball and other electromechanical games of skill are already popular in arcades, carnivals, and shooting galleries. In addition to pinball tables, there are games that mimic sports like baseball, bowling, and race car driving, and military-themed games based on airplanes and submarines (Periscope, Sega, 1966). Most of these games are coin-operated, a dime or a quarter giving a credit for play, hence the term “coin-op”. These exist alongside traditional carnival games like dunk tanks, skee ball, whack-a-mole, tests of strength, and throwing games.
  • William Higinbotham’s “Tennis for Two” demo (1958), played on an oscilloscope. Often cited as the first “video game”.
  • The Mainframe Age. Games programmed and played by computer scientists at universities, and scarcely known outside their world.
  • DEC PDP series minicomputer systems (PDP-1, etc.) are sold to large universities and corporations for business and research. Programmers on these systems develop games to run on them almost from the very beginning. They are “mini” compared to older computers, which could occupy an entire building, but still are large enough to occupy several cabinet-sized racks in a room.
  • Colossal Cave Adventure, Hunt the Wumpus, and similar games, mostly text-based, are played on teletypes and line printers, as most computers lack video displays. Text dominates for a few years, because graphics are incredibly primitive, limited, and expensive, and because keyboards and teletypes are far more common. Today there’s some (mostly academic) debate as to whether a text-only game counts as a “video” game, but they are clearly computer games, and represent a significant branch of the history, particularly in the 70’s and 80’s.
  • Spacewar (1962) developed on the PDP-1, often cited as the first videogame.

1972-1977

  • At home: Magnavox Odyssey, Ralph Baer‘s “Brown Box” more or less invents controllable graphics displayed on a television screen (as opposed to oscilloscope screen or computer display).
  • Atari founded, dawn of coin-op (arcade) games. There were arcades before Atari, with electromechanical shooting and racing games featuring electric lights, bells, and mechanical levers and switches, as well as pinball tables, shooting galleries, and carnival games of skill. But videogames were a quantum leap forward, forever changing the landscape.
  • After first developing a ripoff of Spacewar in an arcade cabinet called Computer Space (which struggles to catch on), Nolan Bushnell rips off Baer’s Brown Box, creating Pong, and with it, Atari has its first big commercial success.
  • Many pinball companies start producing videogames as well (Bally, Williams, Midway, Gottlieb). It is a natural expansion for them, as they are already familiar with mechanical and electronic gaming and entertainment, with decades of experience and distribution. Joining them are Japanese companies like Taito, Namco, Nintendo, and Sega. Each produces notable successful games, but Atari leads, innovates, and dominates them all in the videogame sector of the market, eventually becoming the fastest growing company in the history of the world and being acquired by Warner Communications in 1976 (just prior to the launch of the Atari VCS).
  • Some other minor oddball home games, mostly pong clones and dedicated hardware (plays a single game only, not programmable/reconfigurable).
  • Atari’s first cartridge-based home console, the VCS (later renamed the 2600) released in 1977.

1977-1982

  • Sometimes referred to as “the Golden Age of Video Games” or the Atari Age. The early 8-bit era, dominated by processors like the Zilog Z-80, MOS 6502, and Intel’s 8088/8086.
  • Space Invaders (1978) creates a coin shortage in Japan.
  • First color graphics in a coin-op arcade game, Namco’s Galaxian (1979).
  • Atari dominates the arcade; coin-operated games continue to shift from mechanical or electro-mechanical (e.g. pinball) to fully electronic and digital.
  • At home, the Atari VCS revolutionizes home entertainment, selling tens of millions of units, largely on the strength of home versions of Atari’s top coin-op titles, though many unique original games are developed as well.
  • Under Warner’s ownership, Atari struggles to manage this success and its uniquely Californian, anything-goes business culture. Nolan Bushnell exits Atari. New CEO Ray Kassar takes over, immediately getting to work transforming Atari into a conventional business and destroying much of the original culture (which was not in need of much help when it came to self-destruction, to be honest).
  • Also-rans in the early home console market include: Fairchild Channel F (1976), Bally Astrocade/Professional Arcade (1977), Magnavox Odyssey2 (1978), and Vectrex (1982).
  • Later in this era, the ColecoVision (1982) and Mattel Intellivision (1979) rise to become viable competitors to the dominant Atari, but fail to unseat the king of console games. Arguably these later consoles belong to a newer generation than the Atari VCS.
  • To stay competitive, Atari releases its new 5200 console, which is hampered by poor controllers, lack of backward compatibility with the VCS (a 2600 adapter module was available for the 5200, ColecoVision, and Intellivision, however), and a limited library of mostly higher-fidelity ports of coin-op titles that already existed on the 2600 and other consoles. The 5200 soon flounders in the wake of the Crash of 1983, and Atari struggles to continue supporting both the new and older consoles, the massive install base of the 2600 sustaining the company through the Crash.
  • The rise of home computers: the Apple II, Commodore 64, VIC-20, Atari 400/800, the MSX, and the ZX Spectrum. These systems are all largely incompatible with one another, their hardware and software varying widely, but many successful games are ported to run on all of them.
  • Early personal computer gaming is dominated by adventure games from companies like Infocom (text adventures) and Sierra On-Line (early graphic adventures).
  • Mainframe games are still popular in computer labs and universities, and are now being ported to home computers, arcades, and home consoles in various ways. The genre-spawning MUDs (multi-user dungeons) are the earliest ancestors of massively multiplayer online role-playing games; they are soon followed by Rogue, which spawns the roguelike genre, including Hack and NetHack.
  • Graphics range from primitive to very primitive, especially on business PCs (monochrome, CGA, and EGA graphics).

1983-85

  • The Crash. Major downturn in the market as a glut of poor quality games causes a massive drop in demand by consumers. At the time this was misread by many to mean that videogames were a passing fad that had had their day, but really people wanted new, better games.
  • Many companies go out of business or shift to other industries. Of the “Big 3” of the day (Atari, Coleco, Mattel), only Atari comes out of the Crash still in the videogames market. The void left by Coleco and Mattel will be filled by Nintendo and Sega in short order. Atari readies its 7800 console, only to shelve it due to Warner’s sale of Atari to Jack Tramiel, founder of Commodore, who sits on the technology for years, until after Nintendo’s NES has taken the market by storm, far too late for the 7800 to have a chance of being anything more than an also-ran system. Nintendo flirted with Atari, eyeing a partnership for a US launch of the NES as an Atari-branded console, before deciding that they could do just as well if not better on their own (and they did).
  • On the home computer front, Apple, Commodore, Atari remain strong gaming platforms, with increasing competition from generic IBM PC and PC-compatible clones.
  • Floppy disk-based software gives rise to a subculture devoted to hacking copy protection in order to violate copyright and distribute free copies of software, known as “warez”. Underground BBSes and FTP sites carrying warez and cracks are on the rise. Copy protection hackers start “signing” their work with “tech demos” to show off their skill at controlling the hardware, giving rise to the Demoscene, in which circumventing copy protection and product activation falls by the wayside, and all focus goes into creating cool technical demos that stretch the limits of the hardware to display killer graphics and audio, for lulz and bragging rights.

1985-1990

  • The late 8-bit era. The new “Big 3”: the venerable, obsolescent Atari 2600, the NES, and the Sega Master System. Atari is a distant third place by this point, with its newly released 7800 home console having been delayed until too late to head off the exploding popularity of the NES, and its aging 400/800/1200 line of home computers. The success of the NES effectively buries Atari and wins Nintendo a monopoly in home game consoles.
  • The golden age of the arcade has dulled to a “silver age”; arcade games are still popular, but are receding as many arcade locations go out of business, and the era of arcade games literally everywhere, including outside in front of convenience stores and gas stations, soon phases out.
  • IBM PC clones, Intel 8086/8088 and 286 hardware generations. DOS. The early years of the Apple Macintosh, still with only b&w graphics. The pre-WWW internet consists of things like Bulletin Board Systems (BBSes), IRC chat rooms, FTP sites, Telnet, NNTP (usenet/newsgroups), Gopher sites, etc., mostly limited to people in technical departments at large universities (comp sci, physics, math, engineering) and home computer enthusiasts who direct-dialed a specific system via modem, rather than using a dial-up ISP as a gateway connection to “the internet”.
  • Virtual Reality is first conceptualized, as early flight and military vehicle simulators inspire a concept of “total immersion” (where the subject can’t discern a difference between real experiences and virtual ones) as the holy grail that VR R&D will strive for. There is nothing much as far as games technology at this point, but Atari’s 1980 coin-op 3D wireframe tank simulation game Battlezone is the great-granddaddy of VR games.

1990-1995

  • Arcades are still popular, but to a much lesser degree than in the late 70s/early 80s, as coin-op cabinets recede from literally everywhere to dedicated businesses, which increasingly struggle as improving home console technology makes coin-op games a less and less appealing value proposition by comparison. Many gamers prefer the lengthier, more complex style of play offered by home games, with puzzles and story lines; the adventure, RPG, simulation, and strategy genres gain in popularity.
  • In the arcade, mega-hits Street Fighter II and Mortal Kombat rejuvenate the fading market. 2P vs. fighting games and 4P side-scrolling beat-em-ups such as TMNT, X-Men, and The Simpsons are very popular in arcades in the early 90s. Driving/sit-down vehicle games and light gun games also remain popular.
  • Movie and comic book IP adaptations to the videogame medium increase in popularity, and in fidelity to the source material. It’s increasingly common for a movie release to be accompanied by a video game version that closely follows the story, rather than just being about the characters.
  • The Nintendo GameBoy (1989) ushers in a golden age of handheld gaming, to be followed up in future generations by the GameBoy Color (1998), GameBoy Advance (2001), DS (2004 and onward), and ultimately converging back to television-based gaming in 2017 with the Nintendo Switch. Previous generations of handheld electronic games from the early 1980s were single-title LED- or LCD-based devices that were barely videogames (Tiger LCD games, Nintendo Game & Watch, Coleco LED-based games, etc.). The Atari Lynx (1989) and Sega Game Gear (1990) attempt to compete, but Nintendo dominates the market, and will continue to dominate handheld gaming for the next 20+ years, challenged only by the rise of iOS and Android smartphones and tablets in the late 2000s, despite efforts from others over the years such as the Nokia N-Gage, Sony PSP and Vita, WonderSwan, etc.
  • The 16-bit era. SNES, Sega Genesis, and NEC TurboGrafx-16 home consoles in the early 90’s, giving way to the Nintendo 64 and Sony PlayStation in the mid-90’s.
  • Game developer SNK straddles the arcade and home console market with its NEO GEO system (1990), popular in arcades as a modular coin-op platform, but too expensive for most home gamers, with individual game cartridges selling for $350 per title.
  • Nintendo makes an infamously huge business blunder by flirting with Sony to partner in developing a CD-ROM peripheral for the SNES, and then canceling the project after Sony had committed millions in development costs.
  • Betrayed by Nintendo, and committed to its investment, Sony goes alone, and the PlayStation brand is born, creating a massive new rival with deep pockets and very mature technical and manufacturing capability.
  • Just about every hardware maker except Nintendo releases a CD-based console: Philips CD-i (1991), 3DO (a 1993 attempt at a licensed standard to be manufactured by various companies), NEC TurboGrafx-CD (1990), Sega CD (1991), and Sega Saturn (1994). Most of these are commercial failures that enter and exit the market without much fanfare, too expensive and with game libraries too weak to compete once the Sony PlayStation arrives.
  • By the mid-90’s the 16-bit consoles give way to 32-bit consoles and computers. Bus width captures the imagination of marketing departments, resulting in bizarre, distorted claims about the specs and capabilities of certain systems.
  • Sega’s poorly timed hardware releases keep the company off-kilter as they continue to struggle, dropping to 3rd place in the market. Sega 32x, an expansion module for Sega Genesis, and then Sega Saturn the following year, released in 1994 and 1995, respectively. Sega struggles with the design and timing of launch for its consoles, undermining their success by designing hardware architecture that is difficult to program for, and then too quickly obsolete to recoup investment.
  • Atari’s last gasp, the Jaguar (1993), fails and Atari exits the stage, ceasing to exist as a real company. Sony replaces it; the Big 3 now are Nintendo, Sega, Sony. Fun fact: Atari continued producing Atari 2600 consoles and new games right up until 1992, making it by far the longest-lived home console of all time. Atari sells off intellectual property rights which pass through various companies that attempt to cash in on the early IP by re-releasing collections of classic arcade and home console titles.
  • All the other competitors exit after a few years of insufficient sales attract little third party development, resulting in weak game catalogs.
  • 1995 marks a transition from DOS to Windows 95 and ushers in the home internet era, with services like CompuServe, AOL, and Prodigy bringing dial-up internet access to the masses. No longer just a university/academic thing, the internet boom happens. Microsoft’s Internet Explorer web browser, integrated and bundled for free with Win95, results in a massive antitrust lawsuit, which Microsoft ultimately escapes with a mild settlement despite being found to be a harmful monopoly, a result of shifting political winds after George W. Bush narrowly defeats Al Gore in the highly controversial 2000 presidential election.
  • IBM PC: DOS remains popular with gamers until about the time of Windows 98 or 2000, as it is faster, more stable, and more tunable for getting the most performance out of the hardware. Communities of hardware enthusiasts come together online to seek the ultimate performance tweaks and secrets for getting the most out of their systems, resulting in a sub-culture of “overclockers”, many of whom go on to tech careers or become hackers of one stripe or another.
  • IBM PC: Intel 386, 486, Pentium, and Pentium II chips rapidly increase the performance and capabilities of PC-compatible hardware. CD-ROM drives start to become standard on PCs in the mid-90’s, giving rise to larger games with greater multimedia content, pre-rendered video, etc. The CD-ROM game Myst (1993) is a major mover of CD-ROM drives.
  • 1996 is a major milestone year, as this is when the internet becomes popular and starts to be used by developers to distribute games. Home dialup services like Prodigy, CompuServe, and America Online bring internet access from university computer labs and dorm rooms to middle-class households across America.
  • On the PC, the addition of SoundBlaster sound cards and hardware-accelerated graphics increases the capability of PC games, giving rise to a new emphasis on 3D games, especially 3D first person shooters like Wolfenstein 3D, Doom, Marathon, and Duke Nukem 3D.
  • Microsoft studies Apple’s mouse-driven UI and desktop metaphor, rips it off, and releases Windows, initially a graphical shell that runs on top of DOS and is generally terrible, but runs the vast majority of business applications. The sheer size of the PC install base makes it an attractive market for games, despite widely varying hardware specs from one PC clone manufacturer to another, and from one generation to the next. The difficulty of programming games on such varied, yet nominally standardized, hardware means that compared to console and arcade games, PC games are mostly inferior tech-wise. Console technology is so much more consistent in spec that developers can write code much more tightly coupled to the hardware, getting the absolute most out of the systems’ capabilities, resulting in far superior games despite the fact that console hardware is generally slower and smaller in spec. It’s how you use it.
  • Apple struggles to survive, hanging in there thanks to a very loyal Macintosh user base and a truly superior user experience (although the underlying OS is still terrible), clinging to 1-5% of the personal computer market, and failing to make inroads against commodity-priced generic PC-compatible hardware. The Macintosh moves from Motorola 68000-series CPUs and NuBus architecture to PowerPC-based systems. Despite superior technology, Apple can’t get cheap enough to compete with the generic commodity PC.
  • Start of the Emulation scene, old hardware emulated in software to create a compatibility layer to enable play of software designed for older systems on newer platforms. Emulation scene offshoots ROM hacking and homebrew game development. (Still later in the 20-teens, game hackers will start experimenting with teaching AI to play old, emulated video games, via genetic algorithm and other machine learning techniques).

1996-2000

  • Nintendo tries to leapfrog everyone with its 64-bit Nintendo 64 (1996), but finishes in 2nd place in sales to Sony’s 32-bit PlayStation. Nintendo sticks with the cartridge format in its next-gen system, and struggles to retain 3rd party developers as a result. Many 3rd party game development studios move to the Sony PlayStation due to the greater storage capacity and cheaper manufacturing cost of the CD-ROM format.
  • Nintendo continues to garner a reputation for being a G-rated games company for younger children. This started with the 16-bit console wars between “family-friendly” Nintendo and the older teen/young adult market targeted by Sega, now Sony, and (later) Microsoft.
  • Demise of Sega as a hardware company: their final console, the Sega Dreamcast (1999), launches to strong initial sales, only to be blown out of the market by the Sony PlayStation 2 (2000) and the Microsoft XBox (2001).
  • Emergence of the 3D First Person Shooter (FPS) genre, with games like Doom, Quake, Half-Life, and Unreal leading the way. There were 3D first-person games before Doom, but the genre reached maturity and achieved dominance in the gaming market around this time, and has remained one of the most popular types of games ever since.
  • Apple comes as close to exiting the business as it ever will, before Steve Jobs returns from NeXT and reinvents the company with iMac, iPod, iPhone, and iPad over the next decade. iOS platform becomes an important platform for game developers, with the App Store becoming an important distribution vehicle for indies and established studios alike.
  • Macromedia’s Flash and Shockwave technologies make the web browser an important game platform during this time. Adobe later acquires Macromedia, in 2005.
  • Virtual Reality as a concept enters mainstream public consciousness, but remains very primitive. Nintendo’s early VR console experiment, the Virtual Boy, is a notorious flop. But VR as a concept gets a lot of attention from science fiction writers and Hollywood.
  • Rise of game modding scene, as game software architecture becomes more modular and therefore easier to modify and re-use. Companies develop engines that are used to drive many game releases. Game players get into the act by modifying published games, adding new life and replayability to them by creating new maps, items, in some cases entire new games, and sharing them with the online community.

2000-2006

  • Sony releases its PlayStation 2 console in 2000. Gamers camp outside of stores like Best Buy in the hopes of getting theirs before they sell out.
  • The Microsoft XBox (2001) is released as Microsoft attempts to muscle its way into the industry. Microsoft was already strong in gaming due to the popularity of DOS and Win9X as gaming platforms; the XBox is a strategic move to create a standardized hardware platform dedicated to gamers and home entertainment. Bulky and ugly, the initial XBox is essentially an Intel Pentium III PC with nVidia graphics, only with a gamepad instead of a keyboard and mouse as the only input, running an OS based on Windows but stripped down to focus on game software. Macintosh-era AAA game developer Bungie is acquired by Microsoft, which takes Halo away from Apple fans and puts it on the XBox.
  • Nintendo releases its first optical disc system, the GameCube (2001), almost a decade after their original plans with the SNES fell through.
  • In the face of two competitors with very deep pockets and more powerful hardware, the GameCube holds its own, largely on the strength of Nintendo’s appeal as a first-party developer, with games like Super Mario Sunshine, Mario Kart, and Metroid Prime.
  • Most internet gaming is done on PC, thanks to the emergence of widespread adoption of broadband internet in homes.
  • HDTV exists during this time but is not widely adopted yet, so consoles of the day do not yet support HDTV resolution. NTSC 480i or 480p are the best resolutions supported in this generation.
  • Consoles are still mostly not internet capable, for practical purposes, although there have been attempts throughout most of the previous generations to try to make use of networking capability in some fashion. (Even back in the early 80’s there were online services for consoles such as Intellivision PlayCable, Nintendo Broadcast System, Satellaview, Sega Mega Modem, and others, but these were too ahead of their time for most consumers to take notice of them, with very limited support for a tiny number of games, in limited markets.) This starts to change with the PS2 and XBox, but online gaming services for PlayStation Network and XBox Live will not come into their own for one more generation.
  • Rise of internet gaming and digital distribution for PC games.
  • Videogames start to become increasingly respected by academics and critics as a valid medium and topic of academic study.
  • Valve launches its Steam platform in 2003, coinciding with the release of Half-Life 2. Steam will become a major player in marketing and digital distribution in the years ahead. Steam marks the rise of the “software as a service” business model.

2006-2012

  • The HDTV era. Increasing popularity of HDTV enables consoles to move from NTSC TV resolutions to 720p and 1080p.
  • Home consoles: Sony PS3, Microsoft XBox 360, Nintendo Wii. The Wii notably does not support HD resolution, but the PS3 and XBox 360 do.
  • These newer consoles are increasingly internet-integrated, many games rely on the internet to work at all, to validate the license, download updates, connect to multiplayer and social/community servers, etc.
  • Games increasingly go digital for distribution; physical media and brick-n-mortar retail decline.
  • The Wii dares to innovate with motion controls; Sony and MS follow suit half-heartedly with the PS3 Move motion controller and Xbox Kinect. Despite weaker hardware specs, the Wii outsells both Sony and Microsoft, taking the market lead.
  • The mobile gaming era. Introduction of iPhone and Android platforms, smartphones and tablets. Rise of touch screen technology creates new input and control opportunities.
  • Rise of indie developers. At first doing direct sales through shareware business models, DIY e-commerce. First Global Game Jam (2009), Ludum Dare (2002, but not widely popular until this period) events happen in the early part of this period, paving the way for countless other game jams to follow.
  • Curated “walled garden” marketplaces (Apple’s App Store, Google Play, Steam, etc.) welcome indie developers, early adopters who initially are rewarded, followed by hordes of also-ran developers who struggle for any attention at all, resulting in “Indypocalypse” by 2015, as shifting business models and reduced barriers to entry create business and marketing challenges that most are unprepared to face and incapable of handling.
  • Valve’s Steam service grows into an increasingly important distribution platform for AAA and indie developers alike. Indie studios especially depend on Steam to release, distribute, and market their games. The Steam Greenlight program allows indies to reach wider audiences more easily than previously possible.
  • Steve Jobs’ refusal to support Adobe Flash player on iPhone is the harbinger of death for Flash-based web content, although the end won’t come for a good decade and more.
  • Rise of “retro” style games that harken back to the 2D era and simpler styles of games, hook into gamer nostalgia.

2012-current

  • Nintendo Wii U (2012), Sony PS4 (2013), Microsoft XBox One (2013), and Nintendo Switch (2017). Wii U is a short-lived failure for Nintendo, but the lessons learned from it give rise to the hot-selling Switch. HDTV and internet now mainstream and assumed to be standard features of nearly all games. UHD (4K) TVs exist, are already relatively inexpensive, increasingly common, and gaining support from PS4/XBone. Nintendo continues to take a “less is more” strategy, opting for innovative design and popular IP to sell less-expensive hardware of significantly lesser capability.
  • Indypocalypse in full swing. Humble Store introduces the Humble Bundle, pay-what-you-want business model for a bundle of popular indie titles that had already sold well. It is a viral sensation and generates millions in revenue despite the fact that you can get the bundle for free; a “beat the average price paid so far” incentive for an extended bundle does the trick. In the mobile world, a race-to-the-bottom approach brings prices for many games to free or nearly free; revenue streams from in-game advertising, in-game purchase, “DLC” (downloadable content) packs, subscription-based licensing etc. struggle with varying success to replace revenue lost from retail sales of copies of physical media. Steam announces end of Greenlight program in 2017.
  • Ralph Baer dies in 2014 at age 92, marking a point at which videogame history begins to shift from a period where everything can still fit within living memory to a phase where history will increasingly consist of what can be recorded and preserved. Historical preservation efforts by Jason Scott at the Internet Archive, and increasing scholarly attention from notables such as Ian Bogost, Henry Lowood, and others.
  • Maturing HTML5 technologies finally start to erode Flash marketshare, resulting in Adobe announcing in 2017 the sunset of Flash is in sight, currently slated for 2020. Future of preservation of Flash-based games uncertain.
  • VR and AR (augmented reality) technologies start getting better, but still very niche. Oculus Rift, Microsoft HoloLens, Google Glass, Google Cardboard, etc. Pokemon Go (2016) is an early standout augmented reality title, and briefly one of the most popular videogames of all time.

GMS2 alternative skins

So it’s a little over a year since YoYoGames released the public beta for GameMaker Studio 2.

For a lot of the past year, I’ve been sticking with GMS1.4 in order to work on a project that isn’t yet ready to migrate to GMS2, but I’m also trying to use GMS2 when I can, to keep up to date with it and to get used to the changes. Overall, GMS2 is definitely better, and from a language standpoint GML is only slightly different, and what differences they’ve made to the language are all improvements.

On the IDE side, though, I constantly find myself wishing that YYG had made a less radical redesign of the user interface. I’ve had a number of issues with the new UI, from the way the new workspaces and chain-view windows work to the fact that saving works differently. But that’s not what I’m here to talk about today. (I’ll probably touch on those in future articles at some point.)

GMS2 Dark theme

The default Dark theme for GMS2.

One of the things that I haven’t been able to get used to with GMS2’s new IDE is the new Dark theme. For years, GameMaker has opted for a default IDE theme that uses light text on a dark background, and with GMS2, YoYoGames took this concept to its logical extreme, opting for a black background and white-on-black icons and white label text everywhere.

GMS1 GMgreen theme

The default theme for GMS1, GMgreen, was likewise dark.

I didn’t mind the GMS1.x theme: the dark grey background of the window panes, the resource tree’s black text on a white background, the code editor’s dark grey background with colorful, syntax-highlighted text, and the toolbars with their colored icons. While it’s not the standard Windows theme colors, it’s usable and reasonably attractive, and if you’re the sort of person who prefers to look at light text on a dark background, it’s quite good.

And to be fair, GMS1’s IDE definitely had its failings. Certain windows were “modal”, meaning that you could not switch focus to any other part of the UI while that window was open, even when there was no good reason for them to be. And the user interface for the marketplace My Library had terrible performance-killing bugs with large manifests, which made it all but useless.

But with GMS2, I feel the Dark theme has gone overboard with being too dark, particularly with the toolbar button icons. Being white-on-black just makes them harder to read and harder to distinguish from one another. This slows me down when I try to use GMS2, which is frustrating, since the whole point of the tool is to make me more productive.

There has also always been a light theme that YoYoGames provides “out of the box” with GameMaker, in case you’re the sort of person who prefers to look at dark text on a white background.

GMS2 light theme

The light theme for GMS2 appeals to users who prefer reading dark text on a light background, but I still prefer something with a bit more color and contrast, and sharper outlines, so I can easily differentiate between different parts of the IDE UI.

There are certain colors in the syntax highlighting that contrast poorly against a white background. These should be fixed, but YoYo’s attitude seems to be “you can fix it, so fix it yourself”: they provide preferences that allow you to set the colors yourself if you want to. So, great, you can have exactly the color scheme you want in the code editor. Isn’t that wonderful?

The problem with this is, if you want to take screen captures of your IDE and share them with others, your non-standard code highlighting will be apparent to your audience, and may hinder their ability to parse the text. It’s hardly surprising that we become dependent on the syntax highlighting we see all the time, to the point that once we get used to it, someone else’s color scheme looks “wrong” to us and becomes more of a hindrance than a help.

If you want a full makeover for your IDE, you have to go beyond the syntax highlighting colors and create your own IDE theme. Doing so will give you full control over the appearance of the entire IDE. The downside is that YYG doesn’t support anything but their own themes, so if their theming templates ever change, breaking your custom theme, you’ll have to fix it yourself. It’s also possible that installing updates can overwrite the theme directory, so keep a backup of your theme files. Fun! So instead of spending all your time doing game development, you can take a slice of your time hacking the IDE to do things that arguably the vendor should have gotten right, or at least implemented better so that you wouldn’t face such a temptation. Hopefully this doesn’t happen regularly.

While I like tools that can be customized, I prefer to focus on developing games, not customizing the tools that make games. Too much customizing turns me from a game developer into a game development tool developer. While the skillsets overlap, I really want to maximize the time I put into being a game developer.

Naturally, this has led to some third parties releasing their own themes, sharing them with the user community, thus saving you from having to do all the work yourself.

GMS2 VS blue custom theme

Based on the default Windows theme colors and Microsoft Visual Studio, this custom theme, called VS blue, is excellent: very readable, and easy on the eyes.

I really like this Visual Studio-inspired theme. The option that YYG did not provide, in addition to their Light and Dark themes, was a “native Windows” theme, and this is pretty much that. In fact, I would love it if YoYo would embrace this theme, reward the developer who created it, iluvfuz, and make it an officially supported theme. This would erase 100% of the snarkiness in this article.

It’s very similar to the GM8 theme for GMS1.x, in that it uses mostly system colors for the window chrome. The GM8 theme was my favorite on GMS1.x, so of course the VS blue custom theme is my favorite for GMS2.

GMS1 GM8 theme

The GM8 theme for GMS1.x was my preferred way to theme my IDE, because it mostly followed the Windows standard theme colors.

 

GameMaker Studio 2: My Library UX suggestions

I’ve been sitting on this suggestion for a while. I raised these concerns on the GameMaker forums some months ago, around the time GMS2 was in beta, or shortly after.

Don’t get me wrong, GameMaker Studio 2 has a much better interface for the Marketplace My Library as compared to GMS1.x, but there are a few things that I would love to see improved with it.

  1. The “New Version” category shows literally only those items for which you already have some version downloaded and for which a newer version is available for download. It does not show items which you have purchased, but never downloaded before. This makes hunting for newly purchased assets in My Library a HUGE pain. I contend that when you have just purchased something and have never downloaded any version of it before, ANY version that exists at all is a “new version” and thus should show up in the New Version category. Please have everything that is “new” (whether a brand new purchase, or an update to an already-downloaded asset) show up here.
  2. Every time I hit the Download button for an asset in the New Version category, My Library jumps to the All Assets view, then shows the asset being downloaded. Now, if I want to go back and find the next asset in New Version to download it, I have to go and click on the New Version category to go back to it, each time. Again, this is a huge pain — a lot of extra clicking back to the New Version category for each available updated asset that I need to download.
  3. There’s no “Select all”. So when I have all these updates awaiting download, I have to download each of them, one at a time. It would be so much better to check a checkbox to select all available new assets, and then download them all with a single click.

GameMaker Studio Tutorial: Getting Into Shaders

Shaders have been a part of GameMaker Studio for a while now, having been introduced in 2014. Since their inclusion, I have mostly remained mystified by them, with only a vague and cloudy understanding of what they are and what they can do, and I haven’t used them at all. That will [hopefully] start to change today.

As always, when I try to learn something new in programming, I find that writing up what I’ve learned helps me to remember and keeps my learning organized. I like to publish my notes so that they can help others, and so that others can find errors and make suggestions for better ways to do things. I’m still very new to working with shaders, so I’m not trying to put myself out there like I’m some kind of expert, but here’s what I’ve been able to learn about using shaders with GameMaker Studio so far:

Shader basics

First, we need to understand what a shader is. A shader is a specialized program that processes graphics. Shaders are executed on the Graphics Processing Unit, or GPU, which is specialized hardware for accelerated graphics processing. Thus, shaders are very fast. As well, since they work on the GPU, using shaders will free up the CPU to do other tasks, which can further help to improve the frame rate of your games.

This sounds like enough of a reason to want to use shaders, doesn’t it? Well, it gets better. The main thing about shaders is that they can do amazing visual effects, which can make your game look better, but can also play an active role in how the game plays. For example, you could use a shader to handle the graphical processing of a special view mode in the game, such as night vision or x-ray vision. One of my favorite gameplay mechanics centered on the use of shaders was Daniel Linssen’s Birdsong, winner of the “Overall” and “Theme” categories of the Ludum Dare 31 compo held in 2014. The theme of LD31 was “Entire Game in One Screen”, and Linssen’s approach was to create a giant one-room game crammed into a single screen (no scrolling), and, using a fish-eye lens effect done with a shader, magnify the area around the player so that it was large enough and detailed enough to be playable.

There’s virtually no limit to what graphical effects you can come up with using shaders, other than the limits of your imagination and of course your programming and math skills. It also helps to understand how computers work with graphical concepts such as color, pixels, binary math, and so forth. Additionally, scientific knowledge in disciplines like optics can be very useful. Shaders have their own specialized programming language that they are coded in — actually there are several related languages to choose from. Because of this, shaders are considered an advanced topic in programming, and there are numerous hurdles to surmount in order to be able to write them yourself.

That said, shaders are re-usable bits of code, and so one of the first things you can do when you start getting into shaders is to simply use pre-existing shaders that have been written by other people.

Getting Started with Shaders

Before you can use shaders, you’ll want to familiarize yourself with a few concepts.

Shader references

Here’s links to the relevant pages in the GMS manual on using shaders in the context of GameMaker Studio:

GMS1:

Shaders Overview

Shaders GML reference

Shader Constants

Tech Blog Shaders tutorial: 1 2 3 4

GMC Forums shader tutorial.

GMS2:

Shaders Overview

Shader Constants


Other shader resources (general)

Language References

The four shader languages that GMS supports are: GLSL ES, HLSL9, HLSL11, and GLSL. Which one you need to learn and use will depend on your target platform, but for this article we’ll focus on GLSL ES, since it supports the most target platforms (all of them, except Windows 8).

GLSL ES language specification

HLSL language specification

I haven’t gotten into the shader languages enough yet to know why you’d ever pick HLSL over GLSL, but presumably there must be some advantage to using HLSL when targeting Windows platforms, either in terms of correctness or performance. Otherwise, I would think you’d be better off just sticking with GLSL ES and be compatible with the most targets.

Tools

Shadertoy is a wonderful website that allows you to play with shader programming, running your shaders in your web browser. You can then share your shader creations with the community of users of the website, and in turn you can demonstrate and use shaders written by others.

Other graphical concepts in GameMaker, and how they relate to shaders

It’s not a bad idea to review and understand the entire chapter on Drawing in the GameMaker documentation. There are many concepts that you will need a working knowledge of in order to understand how to use drawing to its fullest capacity, and to get things working together smoothly.

But the manual isn’t the end of the story. Often I find that the manual doesn’t go far enough to explain how different concepts work. The sections on blend modes and alpha testing are particularly inadequate by themselves. The manual also doesn’t go very far to demonstrate or suggest how different features and functions can be connected to one another. That’s for the user to infer, and verify through experimentation. This is great if you are creative and love to experiment and discover. On the other hand, there’s so much that has already been figured out and discovered by others, and it would be nice if that was all documented in an easy to search reference somewhere.

Read the entire manual, cover to cover, if you can. Create little demo projects to test your understanding of what you’ve read, and figure out how to do things. Read it again. And refer to it whenever you need to. There’s no substitute for reading and understanding the manual. I’ll still touch briefly on the major concepts here, for summary:

Surfaces

All drawing happens somewhere, and in GameMaker that somewhere is called a surface. Behind the scenes, a surface is simply a chunk of memory that is used to store graphics data. You can think of it as a 2D grid of pixels, stored in the program’s memory, and drawn to the screen only when called for. You can also think of it as virtual “scratch paper” where you do some work “backstage” and then bring it out to use in the game when needed.

The application has an Application Surface, and by default everything is drawn here. But you can create other surfaces, which you can work on, composing a drawing off-screen, until you are ready to draw it to the screen. As you might imagine, there are countless reasons why this is useful, and endless ways to make use of surfaces.

Surfaces are relatively easy to use, but are considered an intermediate level programmer’s tool in GameMaker, for a few reasons:

  1. Surfaces consume memory, and need to be disposed of when no longer needed.
  2. Surfaces are volatile, and can be destroyed without warning, so should not be assumed to exist. For example, if the player switches focus to a different application, or if the computer enters sleep or hibernation mode, or if the game is saved and resumed, surfaces that were in existence at the time the application was last running may have been cleaned up by the operating system, and will need to be re-created if they don’t exist.
  3. All drawing must happen in one of the Draw events. If you try to use draw functions in other events, it may or may not work correctly, and this will vary from computer to computer. I once made a game where I did some set-up in the Create Event of an object, where I created a surface, drew to it, and then created a sprite from the surface, and assigned the newly created sprite to the object. It worked fine on my computer, but when other players downloaded my game to try it out, it did unexpected things and the graphics were glitched. Fortunately, I figured out what the problem was, and fixed it by moving this sprite creation into the Draw Event. Once I did this, the game ran correctly on everyone’s computer.
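To make those points concrete, here is a minimal sketch of the pattern, all in a Draw event (the names my_surface and spr_example are placeholders, not real assets):

```gml
// Draw event: compose off-screen, then draw the result.
// Surfaces are volatile, so never assume the surface still exists.
if (!surface_exists(my_surface))
{
    my_surface = surface_create(256, 256);
}

// Redirect drawing to the surface...
surface_set_target(my_surface);
draw_clear_alpha(c_black, 0);          // start from a transparent background
draw_sprite(spr_example, 0, 64, 64);   // compose whatever you need
surface_reset_target();

// ...then draw the finished surface to the application surface.
draw_surface(my_surface, x, y);
```

And because surfaces consume memory, remember to release it with surface_free(my_surface) in a clean-up event once the surface is no longer needed.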

Drawings done to surfaces can be run through a shader, as input, and thereby be processed by the shader. In short, a surface can be the input image data for a shader, and the output of the shader will be the processed version of that surface, transformed by the shader.
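Putting those two ideas together is as simple as setting the shader before drawing the surface. A minimal sketch (again, my_shader and my_surface are placeholder names):

```gml
// Draw event: run an off-screen surface through a shader.
if (surface_exists(my_surface))
{
    shader_set(my_shader);
    draw_surface(my_surface, 0, 0);  // the surface is the shader's input image
    shader_reset();
}
```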

Blend Modes

For a long time, long before GMS introduced shaders, GameMaker has provided blend modes. Blend modes affect what happens when GameMaker draws graphics over existing graphics that were drawn previously. Normally, when you draw something, it covers the pixels that were there before. But, by changing blend modes, you can do other things than simply replacing the previous pixels with new pixels, blending the old and the new in different ways according to the mathematical rules of whatever blend mode you had selected.

To be honest, I’m not sure what useful purpose there is for every blend mode. It would be great if there were more tutorials showing useful applications for them, especially the obscure ones that I don’t see used much, if ever.

The most commonly useful blend modes, in my experience, are bm_normal and bm_add. Normal blending is the default drawing mode, and is what you use 99% of the time in most games. Additive blending creates vivid glowing effects, and is particularly lovely when used in conjunction with particle systems to create glowing systems of overlapping particles, especially when you are drawing translucent pixels (using alpha < 1).
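As a quick sketch of the additive case (GMS1-style GML, with spr_glow as a placeholder sprite; GMS2 replaces draw_set_blend_mode with gpu_set_blendmode):

```gml
// Draw event: overlapping translucent sprites with additive blending.
draw_set_blend_mode(bm_add);
draw_sprite_ext(spr_glow, 0, x, y, 1, 1, 0, c_white, 0.5);
draw_sprite_ext(spr_glow, 0, x + 8, y, 1, 1, 0, c_white, 0.5); // overlaps brighten
draw_set_blend_mode(bm_normal); // always restore normal blending when done
```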

Blend modes are also useful for creating clipping masks. For more info on that, there are some good tutorials already written on how to create a clipping mask using surfaces and blend modes.

Some of the first questions I had when Shaders were introduced were: What do we do with blend modes now that we have shaders? Do we still need them? Can we combine them with shaders, somehow? Or do shaders make blend modes obsolete?

Basically, as I understand it, the answer seems to be that blend modes were kind of a limited predecessor to shaders, and enabled GM users to achieve some basic drawing effects simply, without exposing GM users to all that highly technical stuff that shaders involve, that I mentioned above.

Anything you could do with blend modes, can be done with shaders instead, if you wanted to. That said, if all you need is the blend mode, they’re still there, still supported like always, and you can go ahead and use them. They’re still simpler to use, so why not.

One thing to be aware of, though, when using blend modes: every time you change the blend mode in GameMaker, you create a new “batch” of drawing. The more batches, the longer GM will take to draw the game each step. Thus, many batches can slow drawing down tremendously. This is an area where you may need to focus on optimization. And if you’re that focused on performance, then it might be worth looking into a shader-based approach instead.

Once you’ve become sufficiently comfortable with shaders, you may not have as much need for using GameMaker’s drawing blend modes.

D3D functions

I have not used GML’s d3d functions much, either, so my understanding is very limited. Basically, as I understand it, the d3d functions in GameMaker wrap Microsoft’s Direct3D drawing functions, and enable drawing with more sophistication than is possible with the basic GML draw functions such as draw_rectangle, draw_line, draw_ellipse, etc.

Despite the name, the Direct3D functions are useful for 2D drawing as well as for 3D.

This article will not cover using GML’s d3d functions, as we’re focusing on shaders. But as any graphics in your game can be used as input into a shader program, anything you draw using d3d functions can become input for a shader to process.

Particles

Particles are “cheap” (efficient) graphical effects that can be created without having to instantiate an object. They are efficient because they do not incur all the processing overhead that comes with a full-blown object. Huge numbers of particles can be generated at very little cost. These can be used for all sorts of effects, so long as the particles do not need to interact with instances in the game, such as triggering collisions. Typically, particles are used for things like dust clouds, smoke, fire, glowing “energy plasma”, haze, rain, snow, and so on to create additional atmosphere.

To use particles, you have to create a particle system. As with Surfaces, particle systems take memory, and need to be disposed of when no longer needed, in order to free up that memory. Full detail on how to set up and use particle systems is beyond the scope of this article.
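Still, as a taste of the general shape of the API, a minimal smoke-puff effect might look something like this (the variable names are illustrative):

```gml
// Create event: a minimal smoke-like particle effect.
ps = part_system_create();
pt = part_type_create();
part_type_shape(pt, pt_shape_smoke);
part_type_size(pt, 0.1, 0.3, 0.01, 0);      // start small, grow slowly
part_type_alpha2(pt, 0.6, 0);               // fade out over lifetime
part_type_speed(pt, 0.5, 1, 0, 0);
part_type_direction(pt, 80, 100, 0, 0);     // drift roughly upward
part_type_life(pt, 30, 60);                 // lifetime in steps

// Step event (or wherever appropriate): emit a few particles at this position.
part_particles_create(ps, x, y, pt, 3);

// Clean-up event: particle systems consume memory until destroyed.
part_type_destroy(pt);
part_system_destroy(ps);
```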

Several external utilities have been developed by GameMaker users over the years to make designing, building, previewing, and testing particle systems easier, and these are highly recommended.

As for how particles relate to shaders, I don’t know of any direct interplay between the two, but certainly a shader may be used to further process a region of the room where particles exist, to create more sophisticated effects.

Using Shaders in GameMaker Studio

Right, now that we’ve introduced the concept of what a shader is, and reviewed the other main graphics concepts in GMS, here’s where we get to the heart of how to use shaders in GameMaker.

A shader is a pair of shader-language programs, consisting of: a vertex shader, and a fragment shader. The vertex shader processes each corner point (vertex) of the geometry being drawn, while the fragment shader runs for every pixel (fragment) inside that geometry, determining its final color.

Let’s say you want to use a shader program that has already been written, perhaps by someone else. All you need to do is use this code in your draw event:

shader_set(my_shader);
 //draw stuff
shader_reset();

So, it’s a lot like drawing to a Surface. With surfaces, first you set which surface you want to draw to, then you draw, then you reset so that drawing resumes to the application surface. With shaders, you set the shader you want to use, draw the stuff that you want to be transformed by the shader, then reset to normal drawing.

Everything drawn between setting the shader and re-setting back to non-shader drawing will be drawn through the shader program.

Easy enough, right? Well, there’s slightly more to it than that.

Uniforms

“Uniforms” is a strange term at first, and was where shaders started to seem strange to me. This is a term that comes from the shader language itself. The GameMaker manual talks about them in a way that assumes the reader is already familiar with the concept, and doesn’t go into a lot of detail explaining it to newbies.

In essence, “uniforms” are input variables that can optionally be passed into a shader that is designed to use input values. Once you understand what a uniform is, it’s not that difficult a concept.

The gist of it is, when writing a shader program, when you declare a variable, you can declare it to be a uniform variable, which means that the variable can be accessed from outside the shader, thereby giving the program that calls the shader a way to change the shader’s behavior during execution. To do this, you can’t just refer to the uniform variable by name; you have to get a handle to it, using shader_get_uniform(shader, "uniformName"), and then set its value through that handle, using shader_set_uniform_f(handle, value). Uniforms are actually constants within the shader’s execution scope, so once a value is passed into the shader from the outside and it is set as a uniform, it cannot be changed (the value could be copied to another variable, and that variable could then be modified, though).
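On the shader side of that handshake, a uniform declaration in a GLSL ES fragment shader looks something like this (a minimal sketch; f_Colour1 is just a name chosen for illustration):

```glsl
// Minimal GLSL ES fragment shader with one uniform input.
varying vec2 v_vTexcoord;   // texture coordinate from the vertex shader
varying vec4 v_vColour;     // blend colour from the vertex shader

uniform vec3 f_Colour1;     // set from GML via shader_set_uniform_f()

void main()
{
    vec4 base = v_vColour * texture2D(gm_BaseTexture, v_vTexcoord);
    // Tint the incoming texture by the colour passed in from game code.
    gl_FragColor = vec4(base.rgb * f_Colour1, base.a);
}
```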

If you’re using a shader that has uniforms that you need to set, it’s done like this:

u_color1 = shader_get_uniform(my_shader, "f_Colour1");
u_color2 = shader_get_uniform(my_shader, "f_Colour2");

shader_set(my_shader);
shader_set_uniform_f(u_color1, 1, 1, 1);
shader_set_uniform_f(u_color2, 1, 0, 0);
//draw stuff
shader_reset();

There are actually a few uniform functions in GML, one for each type of data being passed in: shader_set_uniform_f and shader_set_uniform_f_array for floats, shader_set_uniform_i and shader_set_uniform_i_array for integers, and shader_set_uniform_matrix and shader_set_uniform_matrix_array for matrices. See the shaders section of the GML reference for the details of each.

Conclusion

That’s about all I know about shaders, for now.

As I get more familiar with using shaders, I’ll update this with more complicated examples, such as (possibly):

  • How to use multiple shaders on the same drawing (eg chaining the results of one shader’s transformation of some drawing into the input for a series of shaders).
  • Other stuff…
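For the curious, the chaining idea would presumably work by ping-ponging between surfaces, using the output of one shader pass as the input to the next. I haven’t tested this pattern yet, so treat it as a rough sketch with placeholder names:

```gml
// Rough sketch: chain two shaders via an intermediate surface.
// surf_a holds the original drawing; surf_b catches the first pass's output.
surface_set_target(surf_b);
shader_set(sh_first_pass);
draw_surface(surf_a, 0, 0);   // pass 1: surf_a through sh_first_pass, into surf_b
shader_reset();
surface_reset_target();

shader_set(sh_second_pass);
draw_surface(surf_b, 0, 0);   // pass 2: the result through sh_second_pass, to screen
shader_reset();
```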

How to write shaders in GLSL, if I ever do it, will be a topic for its own article (or series of articles).

Save Network Neutrality Again

The arguments are no different than they were the last time.

Network Neutrality is the bedrock of freedom in the Internet Age. Network neutrality is an essential requirement to protect freedom of speech in the Internet Age.

Network neutrality means that the network shall treat traffic passing through it the same, regardless of source or destination. A neutral network does not favor one customer over another, giving priority to the one favored by the network while blocking or reducing access to others.

It doesn’t mean a company can’t create service tiers and sell more bandwidth for more money. ISPs already do this. Companies that need to use a lot of network traffic pay for the capacity already.

Network neutrality is the way the internet has worked since the very beginning. Over the years, there have been a number of efforts made to end network neutrality. ISPs have relentlessly tried to end network neutrality because doing so will give them a huge amount of power over the internet, which they will use to make themselves very rich and control the world’s data traffic. The consequences of this would be drastic and dystopian.

Internet Service Providers should be treated as a utility and as a common carrier. We need this put into law so that it does not come up every few years when regulatory appointees change. That will not happen with our current congress or president.

But as I write this, the freedoms protected by network neutrality are once again under assault by the FCC. Current FCC Chair Ajit Pai is a former executive with Verizon, and in his current role is a fox guarding the hen house. He clearly is working in the interest of large ISPs and against the public good. If he and two others vote yes to end network neutrality, it’s all over.

All we can do is tell them we don’t want that. Contact the FCC right now and leave a comment. Call, write, fax, and email.

While you’re at it, tell Congress that you want legislation that will protect network neutrality.

12-month license moves GameMaker Studio toward SaaS business model

YoYoGames recently announced a new edition of GameMaker Studio 2. Called the “Creator Edition”, it is a $40/year subscription.

I’d pointed out earlier in the year that YoYoGames had taken all the necessary steps to make ready to abandon perpetual licensing, and this announcement proves my assessment was right on. See, reddit? I was right.

Permanent licenses are still offered starting at $99, although the software license remains active only as long as the machine it’s installed on is able to log your YoYo Account in with YYG’s license server. Which is to say, if they want to, they can disable your license; and if they go out of business, or if the license server goes down, you won’t be able to use the software.

The Problems of Perfection

The auto maker Lexus used to have a marketing slogan, The Relentless Pursuit of Perfection.

Lexus Pursuit of Perfection

This sounded really good, right? As a customer, you would like perfect products. You don’t want to spend money to acquire problems.

But from a making perspective, perfection is often the enemy of good. It turns out that perfection, the idea of perfection, as well as the pursuit of it, has a lot of problems.

Perfection as Impossibility

First, everyone has heard the cliche that perfection is impossible in the real world. No matter how hard you work at something, there will always be some tiny deviation from the ideal defined by the design or blueprint. We build things to tolerances, understanding that exactly hitting a desired metric is practically impossible. But we don’t need exact, we need good enough, and good enough can be as precise as it needs to be for the application. In some situations, tolerances are more critical than in others.

Chasing after something impossible can waste a lot of resources, and the futility of it can be frustrating. People tend to put an undue amount of pressure on themselves, feel negatively about themselves when they prove incapable of attaining such a high standard, and react negatively to mistakes and criticism, or even hide or deny problems rather than own up to them, which is exactly the wrong way to respond to these learning opportunities.

Subjectivity and Contingency

Second, what is perfect? People have different needs, different opinions. Needs change as the times change. What is ideal for one situation is likely not ideal in another. There’s no such thing as a one size fits all solution that is a perfect fit for everyone. So perfect then means something more like highly customized, bespoke solutions.

But the amount of resources that are required to create such customized solutions for everyone are often better allocated toward more general solutions that are a better fit for the needs of larger groups of people. Imagine an auto maker taking your exact body measurements, and then using those to build a car with a seat that fits you — and only you — exactly perfectly, and where the dashboard and mirrors are likewise laid out in a perfect way so that you can see all the gauges and reach all the controls without straining.

But this would make a car that is many times more expensive to design and build, is much more expensive to maintain and support, and which has limited resale value. It’s much more cost effective and beneficial instead to design a car with adjustable seats, that can fit 99% of people, is cheaper to design, manufacture, and support, and can be sold to anyone, not just people who are close enough in size to you that they can use your bespoke seat.

And even then, a car that is good for commuting isn’t going to be the best for cargo hauling. Or have the best possible fuel efficiency. Or be good off-road. Or have the safest crash rating. The perfect car contingent upon the usual need will score high in several of these important categories, but different users will value categories according to their own criteria, and may have a variety of use cases that they need to cover with a single vehicle, that cannot possibly be perfect for every need and every situation.

The foil of Trade-Offs

As is apparent in the car example above, most things are complex enough that there are always going to be trade-offs. For example, a structure may need to be strong, but often weight is another important factor, and making something both strong and lightweight is a challenge because increasing one tends to diminish the other. Thus, a perfect thing in all aspects is not possible, but perhaps we replace the idea of a thing that is ideal in all categories with the idea of a thing that is a perfect balance of tradeoffs among the categories. But this sort of perfection is a lot more dependent upon subjective and contingent parameters. A set of compromises isn’t the usual concept of what perfection means.

Diminishing Returns

The more you want to improve something, the more effort needs to be put into improving it. At some point, the amount of improvement that you gain for the amount of effort that you put into the improvement will start to decrease. At some point short of perfection, the value of the improvement isn’t worth the effort.

For example, I could study for 2 hours and get an A on a test. I could study an additional 4 hours, and boost my A to an A+. Perhaps by studying 8 hours, I could score 100% on the test. Is it worth it? Or would that 6 hours be better spent getting an A on three other tests?

Can you hear the difference between a $500 stereo and a $5000 stereo? And if so, is it worth spending 10x as much to get that difference? Is a $200 CPU good enough, or do you really need the $1000 CPU that delivers merely 2x the performance? Sometimes it might be. But the situations where extreme high performance is required are comparatively few.

Perfecting a local maximum

Perfecting a thing requires focusing on the thing. But what if the thing isn’t the important thing, and the thing is merely a means to an end? Say you want the best bicycle. So you spend all your time working on making bicycles better, better, and better still, until you have a bicycle that’s nearly perfect. But in the early days of pedal power, the design of bicycles was much different.

Velocipede

A design known as the velocipede (its high-wheeled form is better remembered today as the penny-farthing) featured a large front wheel driven directly by pedals, and a tiny trailing wheel for balance. This design was popular for a time, and in the quest for ever greater mechanical advantage to produce higher top speeds, the front wheel diameter was increased. This gave rise to diminishing returns, and negative tradeoffs in balance and safety.

Local maximum

Prior to the invention of the modern bicycle, velocipede design had evolved to a point where it could go no further. Wheel size was so large that any larger would be impractical or impossible. This was a local maximum for the design, the zenith of a particular design concept, but still better designs were possible.

Bicycle

Eventually the design died out when an inventor hit upon a better design, the modern bicycle, which featured two wheels of identical size and a chain-driven gearing system that allowed higher gearing ratios than were possible in the now-obsolete direct-drive velocipede designs.

Missing the next-gen paradigm shift

But what if your goal isn’t really to ride bikes exactly, and you simply want to go from point A to point B quickly? There might be other vehicles that are better, but if all you can think of is a better bicycle, the idea to invent them may never occur to you. Famously, bicycle mechanics Wilbur and Orville Wright pioneered the airplane. If they’d been focused exclusively on perfecting the design of the bicycle, they never would have come up with their flying machines.

And paradigm shifts continued to happen with flight, from bi-planes to monoplanes, from propeller driven airplanes to jets. At each stage, what worked for the current generation could only be taken so far, and to make the leap to the next generation required a re-thinking and a discarding of older concepts in order for even better concepts to flourish. The old continued to serve while the new was developed, but eventually once the new concept was proven a success, the old fell out of favor. And yet, at no stage was anything truly “perfect.”

The relentless pursuit of improvement

This is not to say that we shouldn’t make things as good as we can. But whatever we do is always done within the scope of certain constraints. Being aware of those constraints, we should make intelligent choices in order to maximize value, without being trapped by an idea of a perfect thing.

csanyk.com © 2016