Search engines are not human beings: they evaluate websites based on their content rather than their design, sound, or animation. Search engines such as Google, Yahoo, and Bing crawl web pages by looking at the text and elements a website contains.
To crawl web pages, a search engine uses a program called a crawler or spider (Googlebot, in Google's case). Crawlers follow links from one page to another and index every page they find along the way. A crawler cannot visit every site daily, because the web contains an enormous number of pages.
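The link-following behaviour described above can be sketched as a breadth-first traversal over a link graph. This is only a toy illustration, not how any real crawler is implemented; the page names and the `PAGES` link map are invented for the example.

```python
from collections import deque

# A toy "web": each page maps to the pages it links to.
# These page names are made up for illustration only.
PAGES = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["post1", "post2"],
    "post1": ["home"],
    "post2": [],
}

def crawl(start):
    """Follow links breadth-first from start, visiting each page once."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        page = queue.popleft()
        order.append(page)            # hand the page off for indexing
        for link in PAGES.get(page, []):
            if link not in seen:      # skip pages already discovered
                seen.add(link)
                queue.append(link)
    return order

print(crawl("home"))  # ['home', 'about', 'blog', 'post1', 'post2']
```

The `seen` set is what keeps the crawler from revisiting pages, which matters because real link graphs are full of cycles (for example, every page linking back to the home page).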
Your job is to verify what a crawler actually sees on your website. Since crawlers do not index design, animation, video, and similar elements, tools called spider simulators can show you how your pages appear to a crawler.
After a web page is crawled, it is indexed and stored in a database for later retrieval. Indexing means identifying the keywords and expressions that describe the page.
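A common way to store such keywords is an inverted index, which maps each word to the pages containing it. The sketch below is a deliberately simplified version, assuming whitespace tokenisation and made-up page contents; production indexes handle stemming, stop words, and far more.

```python
def build_index(pages):
    """Map each lowercased word to the set of pages containing it."""
    index = {}
    for name, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(name)
    return index

# Hypothetical page texts, invented for the example.
docs = {
    "page1": "cheap running shoes",
    "page2": "running tips",
}
index = build_index(docs)
print(sorted(index["running"]))  # ['page1', 'page2']
```

Looking up a query word in the index immediately yields every page that mentions it, which is why indexing makes retrieval fast even over a huge database.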
When a user enters a search query, it is compared against the indexed pages stored in the database. Because many pages may be relevant to the query, the search engine calculates the relevancy of each indexed page using ranking algorithms. Search engines such as Google, Yahoo, and Bing change these algorithms periodically, which is why SEO involves continually adjusting a website so that it stays near the top of the SERPs (Search Engine Result Pages) and attracts users' attention.
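The real ranking algorithms are proprietary and use hundreds of signals, but the basic idea of scoring pages against a query can be shown with a crude term-frequency count. Everything here, from the scoring rule to the page texts, is an invented simplification.

```python
def relevancy(query, pages):
    """Score each page by how often the query words appear in it,
    and return pages sorted from most to least relevant."""
    words = query.lower().split()
    scores = {}
    for name, text in pages.items():
        tokens = text.lower().split()
        scores[name] = sum(tokens.count(w) for w in words)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical pages, invented for the example.
pages = {
    "shoes-shop": "buy cheap running shoes online cheap shoes",
    "run-blog": "running tips for beginners",
    "news": "local news today",
}
ranking = relevancy("cheap running shoes", pages)
print(ranking[0][0])  # the best-matching page comes first
```

The sorted list is, in effect, a miniature SERP: the page with the highest relevancy score is shown first, which is exactly the position SEO tries to win.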
Finally, the search engine retrieves the matching results and displays them in the browser.