According to Whatis.com, a spider is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "crawler" or a "bot." Spiders are typically programmed to visit sites that have been submitted by their owners as new or updated. Entire sites or specific pages can be selectively visited and indexed. Spiders are called spiders because they usually visit many sites in parallel, their "legs" spanning a large area of the "web." Spiders can crawl through a site's pages in several ways. One way is to follow all the hypertext links in each page until all the pages have been read.
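That link-following strategy can be sketched as a breadth-first crawl. The sketch below is only an illustration: the `SITE` dictionary is a hypothetical stand-in for pages fetched over the network, not a real fetcher.

```python
from collections import deque

# Hypothetical site graph standing in for real fetched pages:
# each "page" maps to the hyperlinks found on it.
SITE = {
    "/index": ["/about", "/products"],
    "/about": ["/index"],
    "/products": ["/products/cd", "/products/book"],
    "/products/cd": [],
    "/products/book": ["/index"],
}

def crawl(seed):
    """Follow every hyperlink breadth-first until all pages are read."""
    seen = {seed}
    queue = deque([seed])
    index = []  # pages visited, in crawl order
    while queue:
        page = queue.popleft()
        index.append(page)
        for link in SITE.get(page, []):
            if link not in seen:  # never re-index a visited page
                seen.add(link)
                queue.append(link)
    return index

print(crawl("/index"))
```

A real spider replaces the dictionary lookup with an HTTP fetch and an HTML parser, but the visited-set and queue are the core of the technique.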
2. Differentiate the various types of software agents.
Haag (2006) suggests that there are only four essential types of intelligent software agents:
• Buyer agents or shopping bots
• User or personal agents
• Monitoring-and-surveillance agents
• Data-mining agents
3. Identify various activities in e-commerce where software agents are currently in use.
Buyer agents (shopping bots) - Buyer agents travel around a network (e.g. the internet) retrieving information about goods and services. These agents, also known as 'shopping bots', work very efficiently for commodity products such as CDs, books, electronic components, and other one-size-fits-all products. Amazon.com is a good example of a site that uses a shopping bot: it offers you a list of books that you might like to buy on the basis of what you are buying now and what you have bought in the past.
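The price-comparison side of a shopping bot can be sketched very simply. The `VENDORS` catalogues below are hypothetical stand-ins for live retail sites; a real bot would scrape or query them over the network.

```python
# Hypothetical vendor catalogues standing in for live retail sites,
# keyed by ISBN (a natural identifier for a commodity product).
VENDORS = {
    "store_a": {"978-0132350884": 42.00, "978-0201616224": 38.50},
    "store_b": {"978-0132350884": 39.95},
    "store_c": {"978-0132350884": 44.10, "978-0201616224": 35.00},
}

def best_offer(isbn):
    """Return (vendor, price) of the cheapest listing for a commodity item."""
    offers = [(shop, prices[isbn])
              for shop, prices in VENDORS.items() if isbn in prices]
    if not offers:
        return None
    return min(offers, key=lambda offer: offer[1])

print(best_offer("978-0132350884"))
```

This works well precisely because commodity products are one-size-fits-all: the bot only has to compare prices, not judge quality.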
User agents, or personal agents - Intelligent agents that take action on your behalf. In this category belong those intelligent agents that already perform, or will shortly perform, the following tasks:
• Check your e-mail, sort it according to your order of preference, and alert you when important e-mails arrive.
• Play computer games as your opponent or patrol game areas for you.
• Assemble customized news reports for you. There are several versions of these, including newshub and CNN.
• Find information for you on the subject of your choice.
• Fill out forms on the Web automatically for you, storing your information for future reference.
• Scan Web pages, looking for and highlighting text that constitutes the "important" part of the information there.
• "Discuss" topics with you, ranging from your deepest fears to sports.
• Assist with online job searches by scanning known job boards and sending your résumé to openings that meet the desired criteria.
• Synchronize your profile across heterogeneous social networks.
Monitoring-and-surveillance (predictive) agents - These agents are used to observe and report on equipment, usually computer systems. They may keep track of company inventory levels, observe competitors' prices and relay them back to the company, watch for stock manipulation by insider trading and rumors, and so on.
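The inventory-tracking case can be sketched as a simple threshold check. The stock levels and reorder points below are made-up illustration data; a real agent would poll a live inventory system on a schedule.

```python
def check_inventory(stock, reorder_points):
    """Report every item whose stock level has fallen below its reorder point."""
    return [item for item, level in stock.items()
            if level < reorder_points.get(item, 0)]

# Hypothetical stock snapshot and per-item reorder thresholds.
stock = {"widgets": 12, "gadgets": 3, "gizmos": 40}
reorder_points = {"widgets": 10, "gadgets": 5, "gizmos": 25}
print(check_inventory(stock, reorder_points))
```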
Data-mining agents - These agents use information technology to find trends and patterns in an abundance of information from many different sources. The user can then sort through this information to find whatever they are seeking.
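One classic pattern-finding task is spotting items that are frequently bought together. This is a minimal sketch of pair counting with a support threshold, using made-up transaction data, not a full association-rule miner.

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Count item pairs bought together and keep those above a support threshold."""
    counts = Counter()
    for basket in transactions:
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

# Hypothetical purchase baskets from several sources.
transactions = [
    ["cd", "book"], ["cd", "book", "cable"], ["book", "cable"], ["cd", "book"],
]
print(frequent_pairs(transactions, min_support=3))
```

Production data-mining agents use more scalable algorithms (e.g. Apriori-style candidate pruning), but the trend they surface is the same kind of co-occurrence pattern.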
4. Computing ethics and bot programming case study: Rocky
a. Get an account username and password from the lecturer for LC_MOO at http://ispg.csu.edu.au:7680 and log in to the Welcome Lobby.
I logged in with the user account 'user19'; below is a screenshot of the Welcome Lobby.
b. Hold a 5-minute discussion with Rocky on a special topic. Commands and chat are entered in the command box (bottom-left of screen in Figure 11): act rocky (starts the bot); hush rocky (stops the bot).
Figure 11: LC_MOO screen layout with the Rocky bot object.
c. Rocky is an ELIZA-like bot. Report your findings.
Rocky is an ELIZA-like bot: it matches input against pre-defined instructions and returns the corresponding canned response. You can also create new instructions with LC_MOO commands.
When I typed in an LC_MOO command the bot answered 'I don't understand that', and many other inputs produced the same reply. My conclusion is that Rocky is rubbish: no intelligence, just a junk program with a poor interface that can only answer some pre-defined questions.
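The behaviour observed above is characteristic of keyword/response scripting. A minimal sketch in the ELIZA style, with a few hypothetical rules of my own (not Rocky's actual script), shows why unmatched input always falls through to the same stock reply:

```python
import re

# A few hypothetical rules in the style of ELIZA's keyword/response scripts;
# anything unmatched falls through to a stock reply, much as Rocky did.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bhello\b", re.I), "Hello. What would you like to talk about?"),
]

def respond(utterance):
    """Return the canned response of the first matching pattern, if any."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "I don't understand that"

print(respond("I feel tired"))
print(respond("xyzzy"))
```

There is no understanding here at all, only pattern substitution, which is exactly the impression a short conversation with Rocky gives.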
References
Haag, S. (2006). Management Information Systems for the Information Age, pp. 224-228.
Wikipedia. (2010). Software agent. Retrieved 14 May 2010 from http://en.wikipedia.org/wiki/Software_agent
Whatis.com. (2008). Spiders. Retrieved 14 May 2010 from http://whatis.techtarget.com/definition/0,,sid9_gci213035,00.html