Path (graph theory) in the context of Optimal substructure


โญ Core Definition: Path (graph theory)

In graph theory, a path in a graph is a finite or infinite sequence of edges which joins a sequence of vertices which, by most definitions, are all distinct (and since the vertices are distinct, so are the edges). A directed path (sometimes called dipath) in a directed graph is a finite or infinite sequence of edges which joins a sequence of distinct vertices, but with the added restriction that the edges be all directed in the same direction.
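
As a rough illustration of the definition above, here is a minimal sketch (not from the source text) that checks whether a vertex sequence forms a simple path; the adjacency-set representation and the function name are assumptions of the example.

```python
# Minimal sketch: is a given vertex sequence a simple path in an undirected
# graph? The graph format (dict of vertex -> set of neighbours) is assumed.

def is_simple_path(graph, vertices):
    """True iff `vertices` are all distinct and consecutive ones are adjacent."""
    if len(set(vertices)) != len(vertices):          # vertices must be distinct
        return False
    return all(v in graph[u]                         # each consecutive pair joined by an edge
               for u, v in zip(vertices, vertices[1:]))

g = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(is_simple_path(g, ["a", "b", "c", "d"]))       # True
print(is_simple_path(g, ["a", "c"]))                 # False: a and c are not adjacent
```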

Paths are fundamental concepts of graph theory, described in the introductory sections of most graph theory texts. See e.g. Bondy & Murty (1976), Gibbons (1985), or Diestel (2005). Korte et al. (1990) cover more advanced algorithmic topics concerning paths in graphs.


👉 Path (graph theory) in the context of Optimal substructure

In computer science, a problem is said to have optimal substructure if an optimal solution can be constructed from optimal solutions of its subproblems. This property is used to determine the usefulness of greedy algorithms for a problem.

Typically, a greedy algorithm is used to solve a problem with optimal substructure if it can be proven by induction that the greedy choice is optimal at each step. Otherwise, provided the problem exhibits overlapping subproblems as well, divide-and-conquer methods or dynamic programming may be used. If there is no appropriate greedy algorithm and the problem fails to exhibit overlapping subproblems, a lengthy but straightforward search of the solution space is often the best alternative.
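
A hedged sketch of how optimal substructure plays out for paths: in a weighted directed acyclic graph, a shortest path to a vertex is a shortest path to one of its predecessors plus a final arc, so dynamic programming can settle vertices in topological order. The graph format and function name below are assumptions of the example, not anything prescribed by the source.

```python
# Single-source shortest paths in a weighted DAG, illustrating optimal
# substructure: dist[v] is built from the already-optimal dist[u] of v's
# predecessors. Requires Python 3.9+ for graphlib.
import math
from graphlib import TopologicalSorter

def dag_shortest_paths(dag, source):
    """dag: dict mapping each vertex to a list of (successor, weight) pairs."""
    ts = TopologicalSorter()
    for u, edges in dag.items():
        ts.add(u)
        for v, _ in edges:
            ts.add(v, u)                     # u must be settled before v
    dist = {source: 0}
    for u in ts.static_order():
        if u not in dist:                    # not reachable from the source
            continue
        for v, w in dag.get(u, []):
            # Optimal substructure: a best path to v extends a best path to u.
            dist[v] = min(dist.get(v, math.inf), dist[u] + w)
    return dist

g = {"s": [("a", 2), ("b", 5)], "a": [("b", 1)], "b": []}
print(dag_shortest_paths(g, "s"))            # {'s': 0, 'a': 2, 'b': 3}
```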

Explore More Topics in this Dossier

Path (graph theory) in the context of Hierarchical

A hierarchy (from Greek: ἱεραρχία, hierarkhia, 'rule of a high priest', from hierarkhes, 'president of sacred rites') is an arrangement of items (objects, names, values, categories, etc.) that are represented as being "above", "below", or "at the same level as" one another. Hierarchy is an important concept in a wide variety of fields, such as architecture, philosophy, design, mathematics, computer science, organizational theory, systems theory, systematic biology, and the social sciences (especially political science).

A hierarchy can link entities either directly or indirectly, and either vertically or diagonally. The only direct links in a hierarchy are to one's immediate superior or subordinate. Hierarchical links can extend "vertically" upwards or downwards via multiple links in the same direction, following a path. All parts of the hierarchy that are not linked vertically to one another can also be "horizontally" linked through a path by traveling up the hierarchy to find a common direct or indirect superior, and then down again. This is a system of co-workers or colleagues; each reports to a common superior, but they have the same relative amount of authority. Organizational forms exist that are both alternative and complementary to hierarchy. Heterarchy is one such form.
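
The "up to a common superior, then down" traversal described above can be sketched as a small routine; everything below (the superior-map representation, the function names, and the example org chart) is hypothetical, for illustration only.

```python
# Sketch: connect two members of a hierarchy "horizontally" by walking up to
# their nearest common superior and then back down. The hierarchy is assumed
# to be a dict mapping each member to its immediate superior (root -> None).

def chain_to_root(superior, member):
    """The member followed by each successive superior up to the root."""
    chain = [member]
    while superior[member] is not None:
        member = superior[member]
        chain.append(member)
    return chain

def horizontal_path(superior, a, b):
    up = chain_to_root(superior, a)
    down = chain_to_root(superior, b)
    common = next(x for x in up if x in down)        # nearest shared superior
    return up[:up.index(common) + 1] + list(reversed(down[:down.index(common)]))

org = {"ceo": None, "cto": "ceo", "cfo": "ceo", "dev": "cto", "accountant": "cfo"}
print(horizontal_path(org, "dev", "accountant"))
# ['dev', 'cto', 'ceo', 'cfo', 'accountant']
```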

View the full Wikipedia page for Hierarchical

Path (graph theory) in the context of Tree (graph theory)

A directed tree, oriented tree, polytree, or singly connected network is a directed acyclic graph (DAG) whose underlying undirected graph is a tree. A polyforest (or directed forest or oriented forest) is a directed acyclic graph whose underlying undirected graph is a forest.
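
As a quick check of this definition, one hedged sketch (format and names assumed, not from the source): a directed acyclic graph is a polytree exactly when its underlying undirected graph is a tree, i.e. connected with one fewer edge than it has vertices.

```python
# Sketch: test whether a DAG (dict of vertex -> list of successors, assumed
# acyclic and without duplicate arcs) is a polytree, i.e. whether its
# underlying undirected graph is a tree.
from collections import deque

def is_polytree(dag):
    vertices = set(dag) | {v for succs in dag.values() for v in succs}
    if sum(len(s) for s in dag.values()) != len(vertices) - 1:
        return False                                  # a tree has |V| - 1 edges
    undirected = {v: set() for v in vertices}         # forget edge directions
    for u, succs in dag.items():
        for v in succs:
            undirected[u].add(v)
            undirected[v].add(u)
    start = next(iter(vertices))                      # BFS for connectivity
    seen, queue = {start}, deque([start])
    while queue:
        for w in undirected[queue.popleft()]:
            if w not in seen:
                seen.add(w)
                queue.append(w)
    return seen == vertices

print(is_polytree({"a": ["b", "c"], "b": [], "c": []}))   # True
print(is_polytree({"a": ["b"], "b": [], "c": []}))        # False: disconnected
```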

View the full Wikipedia page for Tree (graph theory)

Path (graph theory) in the context of Hamiltonian path

In the mathematical field of graph theory, a Hamiltonian path (or traceable path) is a path in an undirected or directed graph that visits each vertex exactly once. A Hamiltonian cycle (or Hamiltonian circuit) is a cycle that visits each vertex exactly once. A Hamiltonian path that starts and ends at adjacent vertices can be completed by adding one more edge to form a Hamiltonian cycle, and removing any edge from a Hamiltonian cycle produces a Hamiltonian path. The computational problems of determining whether such paths and cycles exist in graphs are NP-complete; see Hamiltonian path problem for details.
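
Because the decision problem is NP-complete, no polynomial-time algorithm is known; for very small graphs, though, a brute-force check over all vertex orderings works. The sketch below is an illustration of that brute force, with an assumed adjacency-set format, not an excerpt from the source.

```python
# Brute-force Hamiltonian path test: try every ordering of the vertices and
# accept if consecutive vertices are always adjacent. Exponential, but fine
# for tiny graphs. graph: dict of vertex -> set of neighbours (assumed).
from itertools import permutations

def has_hamiltonian_path(graph):
    for order in permutations(graph):                 # each vertex exactly once
        if all(v in graph[u] for u, v in zip(order, order[1:])):
            return True
    return False

square = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
print(has_hamiltonian_path(square))                   # True, e.g. 1-2-3-4
```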

Hamiltonian paths and cycles are named after William Rowan Hamilton, who invented the icosian game, now also known as Hamilton's puzzle, which involves finding a Hamiltonian cycle in the edge graph of the dodecahedron. Hamilton solved this problem using the icosian calculus, an algebraic structure based on roots of unity with many similarities to the quaternions (also invented by Hamilton). This solution does not generalize to arbitrary graphs.

View the full Wikipedia page for Hamiltonian path

Path (graph theory) in the context of Distance (graph theory)

In the mathematical field of graph theory, the distance between two vertices in a graph is the number of edges in a shortest path (also called a graph geodesic) connecting them. This is also known as the geodesic distance or shortest-path distance. Notice that there may be more than one shortest path between two vertices. If there is no path connecting the two vertices, i.e., if they belong to different connected components, then conventionally the distance is defined as infinite.

In the case of a directed graph the distance d(u,v) between two vertices u and v is defined as the length of a shortest directed path from u to v consisting of arcs, provided at least one such path exists. Notice that, in contrast with the case of undirected graphs, d(u,v) does not necessarily coincide with d(v,u), so it is just a quasi-metric, and it might be the case that one is defined while the other is not.
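
A minimal sketch of the unweighted case (representation and names assumed, not taken from the source): breadth-first search settles vertices in order of increasing distance, so the first time the target is dequeued its distance is the geodesic distance; for a directed graph the same code applies with out-neighbour lists, giving the quasi-metric described above.

```python
# Geodesic (shortest-path) distance in an unweighted graph via BFS.
# graph: dict of vertex -> iterable of neighbours (assumed format).
import math
from collections import deque

def distance(graph, u, v):
    dist = {u: 0}
    queue = deque([u])
    while queue:
        x = queue.popleft()
        if x == v:
            return dist[x]                    # first dequeue = shortest distance
        for y in graph[x]:
            if y not in dist:
                dist[y] = dist[x] + 1         # one more edge along a shortest path
                queue.append(y)
    return math.inf                           # no connecting path: distance is infinite

g = {"a": ["b"], "b": ["a", "c"], "c": ["b"], "d": []}
print(distance(g, "a", "c"))                  # 2
print(distance(g, "a", "d"))                  # inf (different components)
```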

View the full Wikipedia page for Distance (graph theory)

Path (graph theory) in the context of Transitive reduction

In the mathematical field of graph theory, a transitive reduction of a directed graph D is another directed graph with the same vertices and as few edges as possible, such that for all pairs of vertices v, w a (directed) path from v to w in D exists if and only if such a path exists in the reduction. Transitive reductions were introduced by Aho, Garey & Ullman (1972), who provided tight bounds on the computational complexity of constructing them.

More technically, the reduction is a directed graph that has the same reachability relation as D. Equivalently, D and its transitive reduction should have the same transitive closure as each other, and the transitive reduction of D should have as few edges as possible among all graphs with that property.
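
For the acyclic case the reduction is unique and easy to state: keep an arc u → v only if v cannot be reached from u by any longer route. The following sketch assumes the input is a DAG given as successor lists; the format and helper names are illustrative, not from the source.

```python
# Transitive reduction of a DAG: drop u -> v whenever v is still reachable
# from u after ignoring that direct arc (any such route has length >= 2).

def transitive_reduction(dag):
    def reaches_indirectly(u, v):
        # DFS from u's other successors; the direct arc u -> v is skipped.
        stack = [s for s in dag[u] if s != v]
        seen = set(stack)
        while stack:
            x = stack.pop()
            if x == v:
                return True
            for y in dag[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
        return False

    return {u: [v for v in succs if not reaches_indirectly(u, v)]
            for u, succs in dag.items()}

d = {"a": ["b", "c"], "b": ["c"], "c": []}
print(transitive_reduction(d))                # {'a': ['b'], 'b': ['c'], 'c': []}
```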

View the full Wikipedia page for Transitive reduction

Path (graph theory) in the context of Shortest path problem

In graph theory, the shortest path problem is the problem of finding a path between two vertices (or nodes) in a graph such that the sum of the weights of its constituent edges is minimized.

The problem of finding the shortest path between two intersections on a road map may be modeled as a special case of the shortest path problem in graphs, where the vertices correspond to intersections and the edges correspond to road segments, each weighted by the length or distance of each segment.
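
One standard way to solve it with non-negative weights is Dijkstra's algorithm; the short sketch below is a generic textbook version, with an assumed graph format, rather than anything specified by the source.

```python
# Dijkstra's algorithm for single-source shortest paths with non-negative
# weights. graph: dict of vertex -> list of (neighbour, weight) pairs (assumed).
import heapq
import math

def dijkstra(graph, source):
    dist = {v: math.inf for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                       # stale entry, already improved
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v] = d + w               # shorter path to v found via u
                heapq.heappush(heap, (dist[v], v))
    return dist

roads = {"a": [("b", 4), ("c", 1)], "b": [("d", 1)],
         "c": [("b", 2), ("d", 6)], "d": []}
print(dijkstra(roads, "a"))                   # {'a': 0, 'b': 3, 'c': 1, 'd': 4}
```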

View the full Wikipedia page for Shortest path problem

Path (graph theory) in the context of Intelligence cycle

The intelligence cycle is an idealized model of how intelligence is processed in civilian and military intelligence agencies, and law enforcement organizations. It is a closed path consisting of repeating nodes, which (if followed) will result in finished intelligence. The stages of the intelligence cycle include the issuance of requirements by decision makers, collection, processing, analysis, and publication (i.e., dissemination) of intelligence. The circuit is completed when decision makers provide feedback and revised requirements. The intelligence cycle is also called intelligence process by the U.S. Department of Defense (DoD) and the uniformed services.

View the full Wikipedia page for Intelligence cycle

Path (graph theory) in the context of Cycle (graph theory)

In graph theory, a cycle in a graph is a non-empty trail in which only the first and last vertices are equal. A directed cycle in a directed graph is a non-empty directed trail in which only the first and last vertices are equal.

A graph without cycles is called an acyclic graph. A directed graph without directed cycles is called a directed acyclic graph. A connected graph without cycles is called a tree.
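
A directed cycle can be detected with depth-first search: an arc back to a vertex that is still on the recursion stack closes a cycle. The sketch below assumes a successor-list format and is an illustration, not an excerpt from the source.

```python
# Detect a directed cycle with DFS: GRAY marks vertices on the current
# recursion stack, and an arc into a GRAY vertex is a back edge, i.e. a cycle.

def has_directed_cycle(graph):
    WHITE, GRAY, BLACK = 0, 1, 2              # unvisited / in progress / finished
    color = {v: WHITE for v in graph}

    def dfs(u):
        color[u] = GRAY
        for v in graph[u]:
            if color[v] == GRAY:              # back edge: directed cycle found
                return True
            if color[v] == WHITE and dfs(v):
                return True
        color[u] = BLACK
        return False

    return any(color[v] == WHITE and dfs(v) for v in graph)

print(has_directed_cycle({"a": ["b"], "b": ["c"], "c": ["a"]}))   # True
print(has_directed_cycle({"a": ["b"], "b": ["c"], "c": []}))      # False (a DAG)
```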

View the full Wikipedia page for Cycle (graph theory)

Path (graph theory) in the context of Reachability

In graph theory, reachability refers to the ability to get from one vertex to another within a graph. A vertex s can reach a vertex t (and t is reachable from s) if there exists a sequence of adjacent vertices (i.e. a walk) which starts with s and ends with t.

In an undirected graph, reachability between all pairs of vertices can be determined by identifying the connected components of the graph. Any pair of vertices in such a graph can reach each other if and only if they belong to the same connected component; therefore, in such a graph, reachability is symmetric (s reaches t iff t reaches s). The connected components of an undirected graph can be identified in linear time. The remainder of this article focuses on the more difficult problem of determining pairwise reachability in a directed graph (which, incidentally, need not be symmetric).
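
The single-source case is just a graph search; a minimal sketch (representation assumed, not from the source) is below. When every edge is stored in both directions, the returned set is exactly the connected component of the start vertex, which is why undirected reachability reduces to comparing components.

```python
# The set of vertices reachable from s, via breadth-first search.
# graph: dict of vertex -> list of successors (assumed format).
from collections import deque

def reachable_from(graph, s):
    seen = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

g = {"a": ["b"], "b": ["c"], "c": [], "d": ["a"]}
print(reachable_from(g, "a"))    # {'a', 'b', 'c'}; d reaches a but not conversely
```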

View the full Wikipedia page for Reachability