One difference I’ve become painfully aware of: for a web developer, each user request ultimately translates into a database query. For a desktop developer, each user request translates first into a query against an object previously loaded into memory.
The problem I have is with a web application that needs a cousin desktop application, so that people with a poor Internet connection can locally cache the data from the web. The application uses a tree-like menu. In the web context, a click on a menu entry translates into an HTTP GET; I know that behind the scenes a query is sent to the database.
In the desktop context, a click on a menu entry translates into a request to a business object storing a tree of menu entries. Every time I have to touch that part of the application, I wonder whether it should fetch the information from the database each time (the web database in some scenarios, the local database in others) instead of reading it from the database once and building the tree-like data structure. The application does respond quickly to user commands, but there are scenarios where the web database changes while the user is still using the desktop application, and he wonders why his menu looks different (the answer is simple: his local copy of the menu no longer matches the server menu).
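One common middle ground between "query every click" and "load once and go stale" is to keep the in-memory tree but tag it with a version fingerprint that can be compared against the server when a connection is available. Below is a minimal sketch of that idea; the class name `MenuCache`, the row shape `(id, parent_id, label)`, and the `refresh_if_stale` method are all hypothetical names for illustration, not part of any existing API, and the sketch assumes the local database has already been synced before a rebuild:

```python
import hashlib
import json


class MenuCache:
    """In-memory tree of menu entries, built once from the local database."""

    def __init__(self, load_entries):
        # load_entries: callable returning rows of (id, parent_id, label),
        # e.g. a thin wrapper around a SELECT on the local database.
        self._load_entries = load_entries
        self._tree = None
        self._version = None

    def _build(self):
        rows = self._load_entries()
        nodes = {rid: {"id": rid, "label": label, "children": []}
                 for rid, _parent, label in rows}
        roots = []
        for rid, parent, _label in rows:
            if parent is None:
                roots.append(nodes[rid])
            else:
                nodes[parent]["children"].append(nodes[rid])
        self._tree = roots
        # Fingerprint the raw rows so the cache can be cheaply compared
        # against a version string reported by the server.
        payload = json.dumps(rows, sort_keys=True).encode()
        self._version = hashlib.sha256(payload).hexdigest()

    def tree(self):
        # Lazily build the tree on first access; later clicks hit memory only.
        if self._tree is None:
            self._build()
        return self._tree

    def refresh_if_stale(self, server_version):
        # Rebuild from the (assumed freshly synced) local database whenever
        # the server reports a different menu version than the cached one.
        if self._version != server_version:
            self._build()


rows = [(1, None, "Reports"), (2, 1, "Monthly"), (3, 1, "Yearly")]
cache = MenuCache(lambda: rows)
menu = cache.tree()
print(menu[0]["label"], [c["label"] for c in menu[0]["children"]])
```

With something like this, the desktop client keeps its fast in-memory reads, and a periodic (or on-reconnect) check of `server_version` explains to the user why the menu changed instead of silently diverging from the server copy.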