We often imagine intelligent machines as sentient beings. In the movies, we always immediately give these beings the keys to our nuclear arsenal. This usually ends poorly. But creating artificial intelligence is different from creating artificial life. We need artificial intelligence to create artificial life, but we don't need artificial life for a machine to be intelligent. Creating a self-aware being like HAL 9000 from 2001: A Space Odyssey is interesting and important work, but we're missing something by grouping the two together.
Let's say I want to build a fully autonomous car so I can get in it and take a nap while it whisks me away. I need it to be intelligent with respect to getting me to my destination safely. It needs to make decisions like not hitting that jaywalking pedestrian and routing us around that traffic.
An autonomous car does not need to understand how to load my dishwasher. It definitely doesn't need to be able to sit with me in my living room and discuss literature.
So an intelligent machine is one that can make decisions about its particular problem domain.
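To make that concrete, here's a toy sketch (all the names and rules below are made up for illustration, not a real driving system): the "intelligence" is just a policy that maps percepts from the driving domain to decisions, with no self-awareness anywhere in sight.

```python
def drive_decision(percepts):
    """Pick an action based only on driving-domain percepts.

    A hypothetical policy: the car doesn't know anything about
    dishwashers or literature, just its own problem domain.
    """
    if percepts.get("pedestrian_ahead"):
        return "brake"        # don't hit the jaywalking pedestrian
    if percepts.get("heavy_traffic"):
        return "reroute"      # route us around that traffic
    return "continue"         # nothing to react to; stay the course


print(drive_decision({"pedestrian_ahead": True}))
print(drive_decision({"heavy_traffic": True}))
```

The point of the sketch is the narrow scope: every input and output lives inside one problem domain, and the machine is "intelligent" only with respect to it.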
If you were going to design a machine to do something that required a fair amount of intelligence, what would it do?