Future neuroprostheses will be so tightly coupled with the user that the resulting system can replace and restore impaired upper limb functions, because they will be controlled by the same neural signals as their natural counterparts. A key component of these neuroprostheses is a brain-machine interface (BMI), which enables users to interact with computers and robots through the voluntary modulation of their brain activity. The central tenet of a BMI is the capability to distinguish different patterns of brain activity in real time, each associated with a particular intention or mental task. This is a challenging problem, given the limited information carried by the brain signals we can measure, regardless of the recording modality. How, then, is it possible to operate complex brain-controlled robots over long periods of time? In this talk I will argue that efficient brain-machine interaction, like the execution of voluntary movements, requires the integration of several parts of the CNS and the external actuators. I will put forward principles for the design of neuroprostheses, which I will illustrate through working prototypes of brain-controlled robots and applications for disabled and able-bodied people alike.