I'm not done with my game yet, but I find it fairly easy as I'm doing both at the same time. What I do (I don't know if this is the best way, but I find it easy yet powerful) is abstract the input for the different devices into separate classes. Note that I'm using LibGDX myself, so I'm already working cross-platform as I build my game.
I, for example, have an AndroidInputManager and a DesktopInputManager (and when I get my OUYA, an OUYAInputManager too!), which both extend an abstract superclass InputManager. This superclass enforces methods like jump() or walkRight(); how they're actually implemented on a specific device is decided entirely by those subclasses. Somewhere in the constructor of my GameScreen I detect which platform I'm on and instantiate one of those subclasses, to be used in the rest of the game:
// Note that this can probably be done in many different ways and I by no means claim that this is the best method! :)
switch (Gdx.app.getType()) {
    case Desktop:
        input = new DesktopInputManager();
        break;
    case Android:
        input = new AndroidInputManager();
        break;
}

// Further down in the code:
public void someFunction() {
    if (input.jump())
        blah();
}
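To make the idea concrete, here's a minimal, self-contained sketch of what that InputManager hierarchy could look like. The stubbed return values stand in for real LibGDX polling calls (e.g. Gdx.input.isKeyPressed()); everything beyond the class and method names mentioned above is my own assumption, not the author's actual code.

```java
// Sketch of the abstract-input-manager idea: the game only ever talks to
// the abstract type, and each platform subclass decides what "jump" means.
abstract class InputManager {
    abstract boolean jump();
    abstract boolean walkRight();
}

class DesktopInputManager extends InputManager {
    // In a real game: return Gdx.input.isKeyPressed(Input.Keys.SPACE);
    boolean jump()      { return false; }
    boolean walkRight() { return false; }
}

class AndroidInputManager extends InputManager {
    // In a real game: check whether a touch landed on the on-screen button.
    boolean jump()      { return false; }
    boolean walkRight() { return false; }
}

public class InputDemo {
    public static void main(String[] args) {
        // The rest of the game sees only the superclass.
        InputManager input = new DesktopInputManager();
        System.out.println("jump pressed: " + input.jump());
    }
}
```

The point of the abstraction is that someFunction() above never changes when you add an OUYAInputManager; only the switch that picks the subclass does.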
Now if you want to introduce this approach after having already built everything solely around touch, it would probably take some work to extract all the input functionality, but nothing really difficult.
My game is not a port, but rather a game built from the start with OUYA in mind. But previously I've been making games for touch screen devices.
Getting the input is a piece of cake on OUYA, but working out how to best design (esp. the UI) with controller rather than touch in mind has been a little bit more effort.
I'm developing my game for OUYA, mobile and desktop at the same time using libGDX. For me, adding controller support was easy (easier than setting up on-screen buttons to do the same), as was keyboard support. My only worry is all those PS3/PS2/Xbox etc. controllers that use different button and axis codes.
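One common way to deal with pads reporting different codes is to key a mapping table on the controller's reported name and look raw codes up per logical action. This is a self-contained sketch of that idea; the controller names and button codes below are made-up placeholders, not real values — with libGDX you'd fill them in from Controller.getName() and experimentation.

```java
import java.util.HashMap;
import java.util.Map;

// Maps each known controller name to its raw button code per logical
// action, so game code asks for "jump" rather than a magic number.
// All names and codes here are illustrative placeholders only.
public class ButtonMapping {
    static final int ACTION_JUMP = 0;

    static final Map<String, Map<Integer, Integer>> MAPPINGS = new HashMap<>();
    static {
        Map<Integer, Integer> ouya = new HashMap<>();
        ouya.put(ACTION_JUMP, 96);   // placeholder code
        MAPPINGS.put("OUYA Game Controller", ouya);

        Map<Integer, Integer> xbox = new HashMap<>();
        xbox.put(ACTION_JUMP, 0);    // placeholder code
        MAPPINGS.put("Xbox 360 Controller", xbox);
    }

    // Resolve the raw button code for an action on a given pad,
    // or -1 if the pad is unknown.
    static int buttonFor(String controllerName, int action) {
        Map<Integer, Integer> m = MAPPINGS.get(controllerName);
        return (m != null && m.containsKey(action)) ? m.get(action) : -1;
    }

    public static void main(String[] args) {
        System.out.println(buttonFor("OUYA Game Controller", ACTION_JUMP)); // 96
        System.out.println(buttonFor("Unknown Pad", ACTION_JUMP));          // -1
    }
}
```

Unknown pads fall through to a default (-1 here), which is also where you'd hook in a "press the jump button now" remapping screen.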
It depends on the game type. If you are simply simulating a controller on screen, then switching to a physical controller is a simple task. However, if you have a touch-interface game (let's use a very, very simple example of "chess"), switching to a controller paradigm is a whole new world... you have to fairly radically rethink how the interface for your game will work. You go from detecting touch events at certain coordinates (or objects) to needing a cursor that's moved around with the controller [edit: and things like dragging don't really have an appropriate parallel]. You have to use buttons to invoke and navigate menus vs. touch events at certain coordinates/objects.
It gets far more complex the more complicated your game gets, and the more divergent the two control schemes are.
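The "cursor" approach described above can be sketched very simply for a chess-like 8x8 board: each d-pad press moves a selection cursor one square, clamped to the board edges. All the names here are mine, purely for illustration.

```java
// Cursor-based selection for a chess-like board: d-pad events move the
// cursor one square at a time; a confirm button would then select the
// piece under it. Coordinates are clamped so the cursor stays on-board.
public class BoardCursor {
    static final int SIZE = 8;
    int col = 0, row = 0;

    // dx/dy come from d-pad presses, e.g. (1, 0) for "right".
    void move(int dx, int dy) {
        col = Math.max(0, Math.min(SIZE - 1, col + dx));
        row = Math.max(0, Math.min(SIZE - 1, row + dy));
    }

    public static void main(String[] args) {
        BoardCursor c = new BoardCursor();
        c.move(1, 0);   // right
        c.move(0, 1);   // down
        c.move(-5, 0);  // clamped at the left edge
        System.out.println(c.col + "," + c.row); // 0,1
    }
}
```

Compare this with touch, where "select square" is one tap at some coordinates: the controller version needs extra state (the cursor position) and extra per-frame logic, which is exactly the rethinking the answer above describes.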