This is the third of a four-part story documenting my somewhat excruciating and wholly uncharted experience developing an indie game for Panasonic’s obscure VIERA Connect television app market several years ago.
In the first part of this postmortem, I detailed the pains of becoming a VIERA Connect third party developer and struggles of setting up the rigid development environment required by Panasonic.
In the second part, I documented the development process of actually creating a game using the often restrictive VIERA Connect framework.
Time to Submit the App!
Over the course of approximately three weeks, I learned the ins and outs of the original VIERA Connect API (Ajax-CE) and used it to develop a Snake game. In coming up with a name for the app, I observed that the Panasonic app market sorted apps alphabetically by title, with no option to sort by latest releases or popularity (official partners like Game Loft were given preferential treatment, with their apps pinned above the rest). With that in mind, I settled on the name Apple Muncher to ensure it would appear on the first page of available games for added visibility. In that short window of time, I had developed a full-featured game with 25 levels, multiple themes and an online high score system.
As indicated in previous installments, I was one of the first independent developers to jump aboard the Panasonic app market. At the time, there were fewer than two dozen apps available across all categories, and almost all of them came from official partners like Netflix, Game Loft and Amazon. Upon inquiry, I was told by Panasonic that these partners were allowed special access to a “native” API that was not available to third party developers. As such, the big players were able to port their existing Web apps over with little hassle and could leverage far more capabilities than permitted by third party members.
With my app ready to go, I ventured over to the app submission section of the VIERA Connect portal. There was limited information provided apart from a few general guidelines and the submission form itself. I assumed the review and approval process would be comparable to iTunes and Google Play, but had to go in blind as no outside information existed about Panasonic’s app vetting process.
The most difficult part of asset generation came when having to prepare a set of screenshots. Since the only way to execute the app was via the television set itself, there was no easy way to capture the screen. On computers and mobile devices, the print screen functionality makes this an effortless process. However, even an HD capture card connected to the TV would do no good, since the entire app was executed on the television’s internal OS with no video out capability. Since the television supported SD cards, a welcome feature would have been a ‘screenshot’ function that could be called from code to save the screen’s current view to a PNG. Alas, that functionality did not exist.
My approach to acquiring the mandatory screenshots was a mix of painstakingly recreating the scenes sprite-by-sprite using graphic editing software, and photographing my television set’s screen and cleaning it up as best as possible in post-process. In the end, I had 4-5 screenshots that had to be scaled down to 480×270 for the submission form, and several different icons at 360×252, 160×112 and 256×60. The various icons are used on different screens of the app portal once the app has been installed.
The rest of the required metadata was mundane: Title, Description, Price, Available Countries, Supported Devices and so on. I set the price to $1.99 USD and indicated that it would work on all VIERA Connect capable television sets, which at the time was really only a single generation of sets. Finally, I submitted the form. I received an automated email confirming the submission but with no additional information about how or when it would be reviewed. The email simply said “Application created (automatic notification).”
When submitting the app, I gave careful consideration to the price point. Realistically, I had no knowledge of how many VIERA Connect capable television sets had been sold and, in turn, no idea how large the user base could conceivably be. I researched Panasonic press releases, stockholder data and other avenues, yet was still unable to ascertain with any degree of certainty how many compatible sets had been sold.
In the mobile and PC markets, freemium and free-to-play revenue models have become popular for good reason. By releasing an application free-of-charge, the potential exists for many more users to discover and try it out without any financial investment required. Revenue from free applications can still be earned in any number of ways, including in-app purchases and advertisements. I have had great success in other markets by releasing free ad-supported apps that offered optional paid upgrades.
Unfortunately, the Panasonic API did not offer any sort of revenue-generating capabilities for free apps. There was no way to include third party advertisements or in-app upgrades. Likewise, no ability existed to allow a time-based trial so that users could enjoy the full app for a limited amount of time before having to purchase it (a much-welcomed feature on the Microsoft app store for Windows and Xbox).
Given that I had already invested a month of my time and approximately $130 on the developer license, the only feasible way to make any return on my investment was to charge for the app. Other apps on the market averaged about $1.99, so that is the price I went with. In this way, if I could sell even 100 copies I would at least break even after all fees and expenses.
A couple of days after I submitted my app, I was approached by a Panasonic representative who encouraged me to make a free version of the app. Panasonic only supported paid apps in the United States and Europe, whereas a free version would be available in “100 countries” and instantly open the doors for “tens of thousands of players.” It remained unclear how many potential users existed within the United States and Europe alone, but I was reluctant to rework the app as a free version and go through the entire submission process a second time when my first submission hadn’t yet been approved.
Required App Documentation
The first correspondence about my app’s submission came two days later, in the form of a Word document attachment advising me to complete more detailed documentation for the app. This paperwork was about seven pages long and required the creation of an application flow diagram, sectional information, technical requirements, cheat codes to allow the testers to progress through the game, screenshots and more.
With the documentation complete I sent it back via email. Four days later, I got another email that my app status had been changed to “In QA.”
Approximately a week after my initial submission, my app went into the “quality analysis” phase by Panasonic. I had assumed this was not unlike Apple setting the status to “under review” after submitting an app through their market. However, that assumption was quickly dispelled when I got the report back from “the first round” of testing in the form of a detailed spreadsheet.
Round One Testing: Four Failures [January 30]
The spreadsheet I received included four reported issues across two categories including Message Prompts and Functionality.
- If there is no connection to the network or server, an error message SHOULD be displayed.
- The game speed is too fast for the lower level. Cannot go to next row. ↑→↓ / →↓← etc
- Name Textbox is not cleared after Pressing Cancel / Submit.
- Highlight Day week month year in scores
Three of the four listed failures were related to the user experience, and seemed heavily subject to personal preference. I and several others who tested the original game did not have any trouble navigating the snake across the levels at the default speed. The game also supports keyboard controls for more responsive input, for those who do have a USB keyboard hooked up. Even so, I agreed to amend this and subsequently added an ‘easy’ and ‘hard’ mode, with hard mode retaining the original speed. I also added a check for connection errors and made the high score buttons highlight correctly.
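The easy/hard split can be sketched as a simple speed curve. This is an illustrative reconstruction, not the actual game code: the function name, base interval and per-level step values are all hypothetical.

```javascript
// Hypothetical sketch of the difficulty rework: the snake advances one
// cell per tick, so difficulty is just the delay between ticks.
var BASE_INTERVAL_MS = 300; // starting delay between snake moves (assumed)
var MIN_INTERVAL_MS = 80;   // fastest allowed speed (assumed)
var STEP_MS = 10;           // speed-up applied per level (assumed)

function tickInterval(level, mode) {
  // 'easy' starts slower and accelerates more gently;
  // 'hard' keeps the original pacing the game shipped with.
  var base = (mode === 'easy') ? BASE_INTERVAL_MS * 1.5 : BASE_INTERVAL_MS;
  var step = (mode === 'easy') ? STEP_MS / 2 : STEP_MS;
  var interval = base - (level - 1) * step;
  return Math.max(MIN_INTERVAL_MS, interval);
}
```

With numbers like these, easy mode at level 1 would tick every 450 ms versus 300 ms on hard, and both modes bottom out at the minimum interval on the highest levels.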
The reported issue of “Name Textbox is not cleared after Pressing Cancel / Submit” was a deliberate design decision I made when coding the game. In playtesting, I found that having to repeatedly enter your name at the end of each round was a nuisance and very time-consuming with the on-screen keyboard. In the majority of use-case scenarios, the same person would often play multiple rounds of the game in a row. As such, I intentionally retained the entered name until the user exited the app. They were still able to delete and replace the name as they needed, but as a user convenience I stored it in memory. This feature was seen as an issue by the QA person at Panasonic, who must have preferred to manually re-enter their same name every time.
[Sidenote: Panasonic did not publicly allow third party developers to interface with the SD card so there was also no advertised way to store any app data internally; once the app closed all data would be wiped clean. There also was no way to get a unique ID for each device to store and retrieve personalized data remotely in any easy manner.]
I took care of all of the issues and wrote my comments and arguments in the spreadsheet. I disputed the name issue and explained the convenience factor of storing the previous name, but still made some enhancements to it so anyone who did want to change the name could type it immediately without having to manually erase the stored value. I sent back the revised spreadsheet.
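The retained-name compromise amounts to keeping the last entry in a module-level variable and pre-selecting it when the textbox opens. The following is a minimal sketch under those assumptions; all names are hypothetical, and the real Ajax-CE widget API is abstracted away behind a plain object with a `value` property.

```javascript
// Illustrative sketch: the name survives between rounds in memory only,
// since no persistent storage (SD card, device ID) was available.
var lastPlayerName = '';

function openNameEntry(textbox) {
  // Pre-fill with the previous name and select it, so a new player can
  // type over it immediately instead of erasing it key by key.
  textbox.value = lastPlayerName;
  if (textbox.select) { textbox.select(); }
}

function submitScore(textbox, score) {
  lastPlayerName = textbox.value; // remember for the next round
  return { name: lastPlayerName, score: score };
}
```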
Round Two Testing: One Failure [February 2]
A couple of days later, I received a second round of testing results. The three previously listed Functionality issues had now passed their requirements. However, the network connectivity issue remained open, accompanied by these unlikely steps to reproduce it:
- Start the game > Unplug the Ethernet cable > Play the game > Submit a score
I had already worked in safeguards against a loss of connectivity to my server for any reason, but now was being told that if a person explicitly unplugs their network cable or adapter from the TV it should detect and warn of this in the app. I personally felt that if it came down to a total loss of Internet signal, the television firmware itself should monitor and alert the user since no aspect of the app portal or OS is accessible without connectivity. In either case, I spent additional time reworking the behavior so that if the user did unplug their network cable or otherwise lost total connectivity, then tried to submit a score, a more direct error notice would display.
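The reworked submission path boils down to surfacing a direct, user-facing notice whenever the request fails outright. Here is a rough sketch of that behavior; the function names are hypothetical and the platform's actual networking call is abstracted as an injected `transport` callback, since the real Ajax-CE API isn't shown here.

```javascript
// Sketch of the reworked score submission (names are illustrative).
// `transport(payload, callback)` stands in for the platform's network
// call and invokes callback(err, response) when it completes.
function submitHighScore(transport, payload, onDone, onNetworkError) {
  transport(payload, function (err, response) {
    if (err) {
      // A total loss of connectivity (e.g. an unplugged Ethernet cable)
      // now produces a direct error notice instead of failing silently.
      onNetworkError('Network connection lost. Score could not be submitted.');
      return;
    }
    onDone(response);
  });
}
```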
With this and some additional refinements in place, I submitted the revised spreadsheet once again.
Round Three Testing: No Failures [February 8]
Finally, after more than a week of QA testing and fixing, I got the good word that my app had passed the “Initial QA” and would “be sent to the QA team at Panasonic Headquarters for final round testing.” Say what? Apparently even after all of this there was still another round of rigorous testing to go. In this email I also had to confirm that the app was an original creation and did not violate any copyright laws. Still, progress was progress.
Round Four Testing: Weeks Upon Weeks in Limbo [March 19]
This last bout of testing took a staggering 5 weeks to complete. During this time, I received a few other emails including:
- Paid app support was now available in Canada alongside US and Europe. [March 6]
- The app is now officially entering QA testing. [March 7]
- BlackBerry App World / RIM / Digital River contract data for managing the app sales. [March 12]
On March 19, my app was finally marked as Approved and Waiting to Go Live. Hurray! Well, it was approved, but still hadn’t gone live. So I played the waiting game a little longer.
App Status Updated: Finally Live! [April 1]
Finally, after more than nine weeks had passed since I first submitted the app, it went live on the Panasonic app portal. Keep in mind that my app was one of the only new submissions that would have needed to be tested, as I was one of only a few third party developers to try out the platform. Meanwhile, Apple is able to test and approve apps in about a week despite having thousands of new apps to run through at any given time; Amazon generally reviews apps within 24 hours and Google within several hours.
I was just relieved to have made it through the unforgiving validation process. To Panasonic’s credit, the spreadsheets detailing the specific issues were thorough and far more informative than the details (or lack thereof) you get if your app is rejected during Apple’s approval process.
In the final installment of this postmortem, I will be detailing all of the post-launch events. This will include app usage statistics, revenue and further developments.