Unity Debug Diaries: Unity & Apple TV
So I have been doing some initial tests with the Apple TV, and so far, so good. If you have gotten into the Unity Apple TV beta then you know the score by now. The current beta program has a modified version of Unity 5.1 that includes Apple TV as an iOS device in your build settings. If you want into the beta program, contact your local Unity rep and see what they can do. To get in we had to actually bug David Helgason, so I am not saying it will be easy, but there is something really cool about having your stuff up on the big screen.
Right, now for the lowdown. I had two major problems: input and plugins. To get my initial build to work I had to strip out all my plugins: everything that made a call to external iOS functionality, even the ones built into Unity. Don't worry about your internal plugins (things that exist within the engine and player); they should be fine. So your first steps in getting a build running on your Apple TV would be to:
- Update all your iOS plugins.
- Systematically make builds and test on device, ripping out each plugin until the build works. The Xcode error message will always point you to the offender.
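Rather than ripping a plugin out for good, you can often just fence its native calls off so the Apple TV build compiles. Here is a minimal sketch of that idea; the plugin function name is made up, and I'm assuming the beta exposes a tvOS scripting define (if it doesn't, a custom scripting define symbol works the same way):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

// Hypothetical wrapper: guard a native iOS plugin call behind platform
// defines so the Apple TV build compiles without the offending plugin.
public static class NativeBridge
{
#if UNITY_IOS && !UNITY_TVOS
    // Extern call into the iOS plugin (name is illustrative).
    [DllImport("__Internal")]
    private static extern void _ShowNativeSharing();
#endif

    public static void ShowNativeSharing()
    {
#if UNITY_IOS && !UNITY_TVOS
        _ShowNativeSharing();
#else
        // On Apple TV (or in the editor), no-op so the build still runs.
        Debug.Log("Native sharing unavailable on this platform.");
#endif
    }
}
```

Game code then calls `NativeBridge.ShowNativeSharing()` everywhere and never touches the extern directly.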
Once you build successfully, it's time to deal with your input. Unity allows for two main ways of handling input: 1) you can treat the remote like a gamepad, or 2) you can use the touch input from the remote. In both cases the input is not your standard iOS touch input, and therefore you will need to implement either UI element indexing (like pressing Tab in a form) or a cursor. I did the latter because it was the easiest and covered all my use cases. Remember, people can plug game controllers into the device, so it's good to look at the Apple TV more like a game console… can someone say Wii? One thing to note is that I have my own input system for buttons and the like, so implementing this was a cakewalk. I don't know what it would be like using Unity's GUI system.
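For the gamepad-style approach, the cursor can be as simple as a UI element nudged by the axes each frame. A minimal sketch, assuming the default "Horizontal"/"Vertical" axis names and a speed value you would tune yourself:

```csharp
using UnityEngine;

// Minimal cursor sketch: move a UI RectTransform with the remote's
// gamepad-style axes. Axis names and speed are assumptions.
public class RemoteCursor : MonoBehaviour
{
    public RectTransform cursor;   // the UI element used as the cursor
    public float speed = 600f;     // pixels per second, tune to taste

    void Update()
    {
        Vector2 delta = new Vector2(
            Input.GetAxis("Horizontal"),
            Input.GetAxis("Vertical"));
        cursor.anchoredPosition += delta * speed * Time.deltaTime;
    }
}
```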
You can hook up your cursor to the gamepad input using Input.GetAxis(), but you can also use the touch pad data. I would recommend the latter because it gives you more control. The gamepad axes only give you X/Y values from -1 to 1, but the touch pad actually gives you values based on the size of the screen you are rendering to. When you touch the touch pad, it initially registers the tap as the "center", which equates to an initial value of (960, 540), basically 1920x1080 divided by two. So when you create your movement algorithm, record your pressed position and take a delta on the subsequent readings from the touch pad. One thing to note is that to action something, you will still need to track the up and down of another button. Input.GetKey will do the trick, but you can't use your touch input to trigger actions… well, you can, but it would be weird. As for the specifics of button mapping and such, you will get those when you get into the beta program. I'm not sure how much Unity is cool with me discussing that here, but it's all key codes.
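The delta approach above can be sketched like this. The touch pad axis names and the joystick key code here are assumptions (the beta docs list the real mappings); the structure is what matters: record the pressed position on the first reading, move the cursor by the difference on subsequent frames, and fire actions from a separate button press:

```csharp
using UnityEngine;

// Sketch of the delta approach: on the frame the touch pad is first
// pressed it reports roughly (960, 540), so record that as the anchor
// and move the cursor by the difference on subsequent readings.
public class TouchpadCursor : MonoBehaviour
{
    public RectTransform cursor;
    private Vector2 lastPos;
    private bool tracking;

    void Update()
    {
        // Hypothetical axis names for the remote's touch surface.
        Vector2 touch = new Vector2(
            Input.GetAxis("TouchPadX"),
            Input.GetAxis("TouchPadY"));

        if (touch == Vector2.zero)
        {
            tracking = false;              // finger lifted, reset anchor
        }
        else if (!tracking)
        {
            lastPos = touch;               // first reading ~ (960, 540)
            tracking = true;
        }
        else
        {
            cursor.anchoredPosition += touch - lastPos;
            lastPos = touch;
        }

        // Actions still come from a button, not the touch itself.
        if (Input.GetKeyDown(KeyCode.JoystickButton14)) // assumed mapping
            Click();
    }

    void Click() { /* raycast from the cursor position, etc. */ }
}
```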
I want to give a shout-out to the folks at Unity: well done for making it super easy to get onto yet another platform! To all you mobile devs, it's time to make something for the big screen!