AH Artificial Intelligence

Description

Mind map on AH Artificial Intelligence, created by John Lawrence on 26/05/2013.

Resource Summary

AH Artificial Intelligence
  1. Problem Representation
    1. Abstraction

      Note:

      • Problem abstraction is the process of defining the problem clearly and unambiguously. This includes defining the initial or start state, defining the goal or end state, and defining the constraints that apply.
      1. Symbolic Representation

        Note:

        • A way of representing the state space (the space of all states reachable by applying legal moves to the initial state) using symbols to represent actual entities. Production rules (based on legal moves) are used to generate new states. They can be written in English, or in symbolic form: old state -- (legal move) --> new state.
        1. Trees and Graphs

          Note:

          • A search tree is a graphical representation of all states in the state space, starting with the initial state at the top. Usually, repeated states (loops) are not re-expanded. A search graph is similar, but repeated states are combined: a state is never shown more than once on a graph, so graphs are generally more compact. A graph may have bi-directional arrows and cross branches. In an AND/OR graph, AND branches require both legs to be completed and are joined by an arc. Used in route planning: a route is usually split into two halves at an intermediate point, and then split up further.
        2. Search Strategies
          1. Exhaustive Search

            Note:

            • Breadth-first: each layer is explored before moving on to the next layer; can be implemented with a queue. Depth-first: each branch is explored to its end, then the search backtracks to the next branch; can be implemented with a stack. Breadth-first will always find the shallowest (optimal) solution; depth-first will find solutions on the left more quickly. Breadth-first requires more memory, as the whole tree must be stored to generate the solution path; depth-first only stores the current branch. Loops will not affect breadth-first searches. Combinatorial explosion is an issue with exhaustive searches.
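            As an illustration, the two strategies above can be sketched in Python. The `successors` callable and the tiny state space are hypothetical, purely for demonstration:

```python
from collections import deque

def breadth_first(start, is_goal, successors):
    """Explore layer by layer using a queue; returns the shallowest goal path."""
    queue = deque([[start]])          # queue of paths
    visited = {start}                 # repeated states (loops) are not re-expanded
    while queue:
        path = queue.popleft()        # FIFO: oldest (shallowest) path first
        state = path[-1]
        if is_goal(state):
            return path
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None

def depth_first(start, is_goal, successors):
    """Explore each branch to its end using a stack; only the current branch is stored."""
    stack = [[start]]
    while stack:
        path = stack.pop()            # LIFO: newest (deepest) path first
        state = path[-1]
        if is_goal(state):
            return path
        for nxt in reversed(successors(state)):
            if nxt not in path:       # block loops along the current branch
                stack.append(path + [nxt])
    return None

# A tiny hypothetical state space: breadth-first finds the shallowest route,
# depth-first finds the leftmost one.
graph = {"A": ["B", "C"], "B": ["D"], "C": ["G"], "D": ["G"], "G": []}
print(breadth_first("A", lambda s: s == "G", graph.__getitem__))  # ['A', 'C', 'G']
print(depth_first("A", lambda s: s == "G", graph.__getitem__))    # ['A', 'B', 'D', 'G']
```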
            1. Heuristics

              Note:

              • Heuristic is a "rule of thumb" that can be applied to help us evaluate the helpfulness of a node in our pursuit of a goal.  Using heuristics can help "prune" the tree. May not always be accurate or helpful. An evaluation function is usually used to give a cost value of a node.
              1. Hill Climbing

                Note:

                • From the current state, evaluate the cost of each of the successor nodes.  Choose to expand the successor node that has the best score as long as it is better than the current node's score. Issues: May not find the optimal solution (a poorer route initially might actually get better later on) and it may "plateau" and halt without a solution being found.
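                A minimal sketch of this greedy loop, with a hypothetical one-dimensional scoring function chosen only to show the halt condition:

```python
def hill_climb(start, score, successors):
    """Move to the best-scoring successor only while it beats the current
    state's score; otherwise halt (possibly on a plateau or local maximum,
    so the result may not be the optimal solution)."""
    current = start
    while True:
        candidates = successors(current)
        if not candidates:
            return current
        best = max(candidates, key=score)
        if score(best) <= score(current):   # no successor improves on us: stop
            return current
        current = best

# Hypothetical example: the score peaks at x = 3, so the climb stops there.
score = lambda x: -(x - 3) ** 2
neighbours = lambda x: [x - 1, x + 1]
print(hill_climb(0, score, neighbours))  # 3
```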
                1. Best First Search

                  Note:

                  • In this algorithm, the best node currently held in memory (the agenda) will be expanded. This might cause the search to jump between different branches of the tree.
                  1. A*

                    Note:

                    • A* takes into account the evaluation of the node (heuristic function) and the evaluation of the path cost (cost function). These are added to get a total cost. From this it uses the best first algorithm to find its solution.
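                    A sketch of A* using a priority queue as the agenda; the weighted graph and heuristic values below are hypothetical, for illustration only:

```python
import heapq

def a_star(start, is_goal, successors, h):
    """Best-first search ordered by total cost f(n) = g(n) + h(n):
    path cost so far plus the heuristic estimate to the goal."""
    agenda = [(h(start), 0, start, [start])]       # (f, g, state, path)
    best_g = {start: 0}
    while agenda:
        f, g, state, path = heapq.heappop(agenda)  # cheapest total cost first
        if is_goal(state):
            return path, g
        for nxt, step in successors(state):        # step = cost of this move
            g2 = g + step
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(agenda, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None

# Hypothetical weighted graph with heuristic estimates for each node.
graph = {"A": [("B", 1), ("C", 4)], "B": [("G", 5)], "C": [("G", 1)], "G": []}
h = {"A": 3, "B": 4, "C": 1, "G": 0}.__getitem__
print(a_star("A", lambda s: s == "G", graph.__getitem__, h))  # (['A', 'C', 'G'], 5)
```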
                    1. MiniMax

                      Note:

                      • Procedure for a two-player game. Node evaluation functions are either maximised (for the current player) or minimised (for the opponent's move). Not ideal: in reality mistakes are made, and the heuristic function might not be accurate.
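                      The alternation of maximising and minimising can be sketched as below; the two-ply game tree and its leaf evaluations are hypothetical:

```python
def minimax(state, depth, maximising, successors, evaluate):
    """Two-player game search: the current player maximises the evaluation,
    and the opponent is assumed to minimise it on their move."""
    moves = successors(state)
    if depth == 0 or not moves:
        return evaluate(state)                 # static evaluation at the frontier
    scores = [minimax(s, depth - 1, not maximising, successors, evaluate)
              for s in moves]
    return max(scores) if maximising else min(scores)

# Hypothetical two-ply game tree: the opponent picks the worst leaf for us
# on each branch, so we choose the branch whose minimum is largest.
tree = {"root": ["L", "R"], "L": ["L1", "L2"], "R": ["R1", "R2"]}
leaf = {"L1": 3, "L2": 5, "R1": 2, "R2": 9}
value = minimax("root", 2, True, lambda s: tree.get(s, []), lambda s: leaf.get(s, 0))
print(value)  # 3
```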
                    2. Knowledge Representation
                      1. Semantic Nets

                        Note:

                        • Used to describe relationships between objects. Each argument (object) is represented once in a circle. Each relationship is shown via a link (arrowed) between the objects. Easily translated into Prolog facts. Inheritance of objects can be shown clearly via is_a and has relationships.
                        1. Frames

                          Note:

                          • Slots of information built up on each object. A frame may include a Subclass and an Instance slot. A slot may have a default value, which can be overridden if needed by a frame; this is indicated by a * next to the slot. A frame with an Instance slot represents an actual living object.
                          1. Converting Between Frames and Semantic Nets

                            Note:

                            • Semantic net --> frame: objects that are linked by "Subclass" and "Instance" relationships become the frame names. The other relationships form the slot names in each of the frames, and the objects provide the values for these slots. Frame --> semantic net: the names of the frames and the slot values form the object circles of the semantic net; the relationships are taken from each of the slot names.
                            1. Prolog

                              Note:

                              • A declarative language with a built-in depth-first search strategy.
                              1. Recursion

                                Note:

                                • Recursive rules call themselves. They usually need a fact subgoal to come first, then the recursive subgoal. Every recursive rule needs a non-recursive version to come first, otherwise it will loop forever. Recursion cuts down on code, making it more concise.
                                1. Inheritance - from Semantic Nets

                                  Note:

                                  • Semantic nets show inheritance, but not explicitly. Translating a semantic net into Prolog only creates facts; rules need to be created to show the inheritance between subclasses and instances. These are found as follows. Subclass relationships: choose a superclass object in the semantic net (an object that has a subclass relationship pointing to it) and find a relationship off this superclass object. Then:
                                  this_relationship(X, Y) :- subclass(X, Z), this_relationship(Z, Y).
                                  Instance relationships: choose an object that has an instance relationship (or is_a predicate) pointing to it and find a relationship off this object. Then:
                                  this_relationship(X, Y) :- instance(X, Z), this_relationship(Z, Y).
                                  1. Lists

                                    Note:

                                    • A list is written [item1, item2, item3, ...]. The pattern [H|T] splits it: the head H is item1 and the tail T is [item2, item3, ...]. Recursion works by extracting the head and then applying the rule recursively to the tail; the non-recursive version matches the item against the head.
                                    Membership:
                                    member(X, [X|_]).
                                    member(X, [_|Tail]) :- member(X, Tail).
                                    Concatenation:
                                    concat([], L, L).
                                    concat([Head|Tail], L2, [Head|Tail2]) :- concat(Tail, L2, Tail2).
                                2. Rule Based Systems
                                  1. Expert Systems

                                    Note:

                                    • Consists of a user interface, inference engine, knowledge base and working memory. Take out the knowledge base and you get the expert system shell. Created via: 1) knowledge acquisition, involving a domain expert, knowledge engineer and domain user; this can be difficult (vague answers, intuition, lack of support for change by the expert); 2) knowledge representation (Prolog, IF-THEN rules etc.); 3) system validation (matching responses against the expert). HOW and WHY justification: "How did you get that solution?" looks back; "Why did you ask that question?" looks forward.
                                    1. Forward and Backward Rules

                                      Note:

                                      • Forward: IF conditions THEN conclusion. Backward: conclusion IF conditions. An AND/OR tree can also be used to show these (different syntax, same semantics). Backward chaining starts with a conclusion and attempts to satisfy its conditions; it is goal/hypothesis driven and might follow unhelpful paths. Forward chaining starts with what is known and attempts to fire rules to establish more known facts, continuing until the desired fact is found; good for gathering information and monitoring, but might require a lot of user input or generate unhelpful facts. Often many rules could be fired at once: the conflict set. Conflict resolution strategies include: most specific (most conditions), first come first served (order), most recent (added most recently), priorities, and avoiding repetition.
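                                      Forward chaining can be sketched as below. The rule base is hypothetical, and conflict resolution here is simply rule order (first come, first served):

```python
def forward_chain(rules, facts, goal=None):
    """Fire rules whose conditions are all in working memory, adding their
    conclusions as new facts, until the goal is derived or nothing new fires."""
    facts = set(facts)
    changed = True
    while changed and goal not in facts:
        changed = False
        # All rules whose conditions hold form the conflict set; we resolve
        # the conflict by taking the first applicable rule in order.
        for conditions, conclusion in rules:
            if conclusion not in facts and set(conditions) <= facts:
                facts.add(conclusion)   # fire the rule
                changed = True
                break
    return facts

# Hypothetical rule base: IF conditions THEN conclusion.
rules = [
    (["has_fur", "barks"], "is_dog"),
    (["is_dog"], "is_mammal"),
]
derived = forward_chain(rules, ["has_fur", "barks"], goal="is_mammal")
print(sorted(derived))
```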
                                      1. Certainty Factors

                                        Note:

                                        • Provides a degree of confidence in an answer or a conclusion. CFs can be ad hoc in their assignment. CFs need to be combined to give a final CF: CF(conclusion) = MIN(CF(cond1), CF(cond2), CF(cond3), ...) x CF(rule).
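                                        The combination formula is one line of code; the rule and condition CF values below are hypothetical:

```python
def combined_cf(condition_cfs, rule_cf):
    """CF(conclusion) = MIN(CF of each condition) x CF(rule)."""
    return min(condition_cfs) * rule_cf

# Hypothetical rule with CF 0.8 whose three conditions hold with
# CFs 0.9, 0.7 and 1.0: the weakest condition (0.7) dominates.
print(combined_cf([0.9, 0.7, 1.0], 0.8))  # ≈ 0.56
```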
                                      2. Vision

                                        Note:

                                        • Five stages: digitisation; signal processing (enhancing); edge and region detection, which forms the primal sketch; object recognition (AI); image understanding (AI). The Waltz algorithm works on the 2D primal sketch of trihedral shapes (three faces meeting at a common point). Convex edges (+) stick out; concave edges (-) stick in; > marks an obscuring edge. Start by marking all outer edges with >, then apply rules (18 possible) to work out + or - for the rest. Rule of thumb: stick-out bits are +, stick-in bits are -. Applications: quality control, robot/weapons guidance, X-ray screening, weather satellite understanding.
                                        1. Language

                                          Note:

                                          • Four main stages: speech recognition, natural language understanding, natural language generation, speech synthesis. Recognition: finding phonemes (sounds) to split words; issues include homophones such as cheque/check/Czech (lexical ambiguity) and background noise. Understanding: syntactic analysis checks the structure is correct; semantic analysis extracts meaning; pragmatic analysis uses context to help with meaning. Ambiguity: multiple interpretations ("I saw the man with glasses"), imprecision ("the mouse was not working"), changing meanings ("cool", "guys" etc.). Parse tree: a sentence is a noun phrase plus a verb phrase. Noun phrase: 1) proper noun; 2) pronoun; 3) determiner, noun; 4) determiner, adjective, noun. Verb phrase: 1) verb; 2) verb, noun phrase. Sentences are tackled via depth-first search, with backtracking.
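                                          The grammar above can be sketched as a recursive-descent parser, where generators give the depth-first search with backtracking; the lexicon is hypothetical:

```python
# Hypothetical lexicon, mapping words to their categories.
LEXICON = {
    "john": "propernoun", "he": "pronoun", "the": "det", "a": "det",
    "old": "adj", "man": "noun", "dog": "noun", "saw": "verb", "ran": "verb",
}

def parse_np(words, i):
    """Noun phrase: propernoun | pronoun | det noun | det adj noun.
    Yields every possible end position (backtracking via the generator)."""
    cat = lambda j: LEXICON.get(words[j]) if j < len(words) else None
    if cat(i) in ("propernoun", "pronoun"):
        yield i + 1
    if cat(i) == "det":
        if cat(i + 1) == "noun":
            yield i + 2
        if cat(i + 1) == "adj" and cat(i + 2) == "noun":
            yield i + 3

def parse_vp(words, i):
    """Verb phrase: verb | verb noun-phrase."""
    if i < len(words) and LEXICON.get(words[i]) == "verb":
        yield i + 1
        yield from parse_np(words, i + 1)

def is_sentence(words):
    """Sentence: noun phrase followed by verb phrase, consuming every word."""
    return any(j == len(words)
               for i in parse_np(words, 0)
               for j in parse_vp(words, i))

print(is_sentence("the old man saw a dog".split()))  # True
print(is_sentence("saw the dog".split()))            # False
```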
                                          1. Robotics

                                            Note:

                                            • Blocks world: predicates ontable(x), on(x,y) and clear(x), with move(...) operators to perform transitions. Pre-conditions, add facts and remove facts can be included for transition operators. Planning involves finding the moves that take the initial state to the goal state (use search techniques). The usual issues (combinatorial explosion, heuristic functions) sometimes mean working from the goal state back to the initial state is better (means-ends analysis). Global (whole strategy) and local (specific, small-scale) plans are used when planning.
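                                            A minimal blocks-world planner along these lines: each move carries pre-conditions, add facts and remove facts, and planning is a breadth-first search over states. The two-block problem and the on/clear encoding (with the table as a special place) are hypothetical simplifications:

```python
from collections import deque

def make_move(x, y, z):
    """Operator move(x, y, z): move block x from y onto z, expressed as
    (pre-conditions, add facts, remove facts). The table is never 'clear'-tracked."""
    pre    = {("on", x, y), ("clear", x)} | ({("clear", z)} if z != "table" else set())
    add    = {("on", x, z)} | ({("clear", y)} if y != "table" else set())
    remove = {("on", x, y)} | ({("clear", z)} if z != "table" else set())
    return pre, add, remove

def plan(initial, goal, blocks):
    """Breadth-first search from the initial state for a state containing the goal."""
    places = list(blocks) + ["table"]
    start = frozenset(initial)
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, moves = queue.popleft()
        if goal <= state:
            return moves
        for x in blocks:
            for y in places:
                for z in places:
                    if len({x, y, z}) == 3:           # x, y, z all distinct
                        pre, add, remove = make_move(x, y, z)
                        if pre <= state:              # pre-conditions satisfied
                            nxt = frozenset((state - remove) | add)
                            if nxt not in seen:
                                seen.add(nxt)
                                queue.append((nxt, moves + [("move", x, y, z)]))
    return None

# Hypothetical problem: A sits on B; we want B on A.
initial = {("on", "A", "B"), ("on", "B", "table"), ("clear", "A")}
goal = {("on", "B", "A")}
print(plan(initial, goal, ["A", "B"]))
# [('move', 'A', 'B', 'table'), ('move', 'B', 'table', 'A')]
```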
                                            1. Machine Learning

                                              Note:

                                              • Seven types of learning:
                                              Rote learning: memorise everything.
                                              Learning from advice: difficult to transfer to machines as it is ill defined.
                                              Learning from experience: Samuel's Checkers program, genetic algorithms.
                                              Learning from examples (inductive learning): e.g. neural nets, provided with test cases.
                                              Explanation-based learning: uses examples plus domain knowledge (e.g. the Soar system).
                                              Learning by discovery: picking up facts by trying things (very human-like; some projects, e.g. CYC).
                                              Learning by analogy: reacting by considering similar situations, e.g. Evans' IQ test program.

                                              Similar

                                              Teorema de Pitágoras
                                              josefa.h.122
                                              Phrasal verbs
                                              John Goalkeeper
                                              Los últimos Premios Nobel de Literatura
                                              luisalenes
                                              Ecuaciones diferenciales lineales de segundo orden
                                              Juan Beltran
                                              Fichas de Historia de España
                                              stefany1994
                                              VERB TENSES
                                              cesartorres721
                                              LA REPRODUCCION EN LOS ANIMALES
                                              Liliam Beatriz Meneses Quintero
                                              Tecnologías de la información y la comunicación
                                              Jocelyn Glover
                                              Anatomía Animal (Sistema Óseo)
                                              Ana Favela
                                              ¿Conozco las herramientas de diseño AutoCAD?
                                              Sonia Rojas Barbosa
                                              MAPA CONCEPTUAL CAP 1 GERENCIA ESTRATEGICA
                                              DAVID ARCE