US5115399A - Position input system for vehicular navigation apparatus - Google Patents


Info

Publication number
US5115399A
Authority
US
United States
Prior art keywords
intersection
information
desired destination
destinations
displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US07/618,021
Inventor
Mitsuhiro Nimura
Shoji Yokoyama
Takashi Yamada
Koji Sumiya
Shuzo Moroto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Aisin AW Co Ltd
Shinsangyo Kaihatsu KK
Original Assignee
Aisin AW Co Ltd
Shinsangyo Kaihatsu KK
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aisin AW Co Ltd, Shinsangyo Kaihatsu KK
Application granted
Publication of US5115399A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers

Abstract

A position input system for a vehicular navigation apparatus includes an input unit for inputting a desired destination and a present position, a memory unit for storing desired destination information and map information, and a display unit for outputting the desired destination information and map information in response to input information from the input unit. The map information is indicative of intersections and a geographical point located between intersections and possessing useful guidance information. A desired destination and present position are capable of being inputted by setting the useful guidance information between a road leading to and a road leading from the geographical point. This makes it possible to enter position simply and accurately. Positions are precise since they are selected from pre-stored information relating to geographical locations. Vehicle direction can also be inputted in accurate fashion where parking lot exits and intersections are concerned.

Description

This application is a continuation of application Ser. No. 290,202, filed Dec. 27, 1988, now abandoned.
BACKGROUND OF THE INVENTION
This invention relates to a vehicular navigation apparatus which provides guidance along a path by outputting guidance information for travel to a desired destination.
A navigation apparatus for automotive vehicles is adapted to provide course guidance for travel to a desired destination to a driver who is unfamiliar with the local geography. Recent years have seen extensive development of such navigation apparatus.
Conventionally, a vehicular navigation apparatus relies upon so-called "route matching" in which a course is set from a starting point to a desired destination before the vehicle begins travelling, with course guidance being provided to the driver in accordance with the course set. In some of these apparatus, a map is displayed on the screen of a CRT and the course is superimposed on the map if the driver designates a specific course. In a case where the navigation apparatus designates an intersection at which a turn is to be made next in accordance with the preset course, the distance to this intersection is displayed numerically or in the form of a graph. When a turn is to be made at an intersection using such a navigation apparatus, the driver observes the course displayed on the map to decide the next intersection at which the turn is to be made, or the driver looks at the numeric or graph display to ascertain the distance to the intersection where the turn is to be made, thereby determining the proper intersection.
However, as mentioned above, the conventional navigation apparatus is such that a course is set from a starting point to a desired destination before the vehicle begins travelling and course guidance is provided to the driver in accordance with the course set. Consequently, if the driver should happen to mistake an intersection and depart from the set course, travel in accordance with the guidance provided by the navigation apparatus will not be able to continue unless the vehicle is returned to the set course. In addition, a decision as to whether or not a predetermined intersection has been passed as specified by the course guidance is based upon detection of travelled distance or a left or right turn as detected by a distance sensor or steering angle sensor, respectively. In actuality, however, detection of travelled distance and steering angle is susceptible to considerable error, which can cause the navigation apparatus to make errors in judgment.
In an effort to solve these problems, Japanese Patent Application Laid-Open (KOKAI) No. 62-51000 proposes a present position input system, which involves inputting the vehicle's present position on a map and displaying a list of place names. Inputting present position makes it possible to perform route matching in a continuous manner.
However, when teaching the navigation apparatus the present position in the system proposed in Japanese Patent Application Laid-Open (KOKAI) No. 62-51000, it is required that the present position on a road map appearing on a display screen be displayed in enlarged form by pressing, with the tip of one's finger, a touch panel on which the display image is superimposed. This makes the apparatus inconvenient to operate. Another problem is that present position cannot be taught correctly from the relationship between the display and the resolution of the touch panel. If the list of place names is displayed so that present position can be selected from the list, it will be necessary for the driver to turn a large number of pages on the display screen if there are many candidates for selection. Conversely, the system will lack comprehensiveness if too few candidates are made available.
The applicant has filed a patent application (U.S. Ser. No. 260,213, filed Oct. 20, 1988) proposing a novel navigation apparatus which relies upon an explorer system instead of the above-described route matching system. In accordance with this system, the coordinates of a plurality of geographical points (e.g., intersections, characterizing structures, etc.) are set and a desired destination is entered, whereupon a course is sought from each geographical point to the desired destination and outputted as guidance information. Navigation is possible even if distance, steering angle and geomagnetic sensors should happen to fail or even if these sensors are not provided. As a result, if the driver strays from a course or changes the desired destination, the apparatus readily provides the driver with guidance to the destination. However, a system through which desired destination, present position and the like can be inputted in simple fashion is required for this navigation apparatus.
SUMMARY OF THE INVENTION
Accordingly, an object of the present invention is to provide a position input system for a vehicular navigation apparatus in which position inputs can be made simply and correctly by obtaining a map in the form of intersections and geographical points located between intersections and possessing useful guidance information.
Another object of the present invention is to provide a position input system, which is particularly effective when applied to a navigation system in which when the coordinates of a plurality of geographical points (e.g., intersections, characterizing structures, etc.) are set and a desired destination is inputted, a course for travel to the destination is sought at each geographical point and outputted as guidance information.
According to the present invention, the foregoing objects are attained by providing a position input system for a vehicular navigation apparatus in which guidance information for travel to a desired destination is outputted upon inputting the desired destination, comprising an input unit for inputting a desired destination and a present position, a memory unit for storing desired destination information and map information, and a display unit for outputting the desired destination information and map information in response to the input information from the input unit, wherein the map information is information indicative of intersections and a geographical point located between intersections and possessing useful guidance information, wherein a desired destination and present position are capable of being inputted by setting the useful guidance information between a road leading to and a road leading from the geographical point.
In accordance with the present invention, as shown for example in FIG. 5, a map is not obtained merely in the form of intersections but also includes geographical points, shown in the form of nodes, having useful guidance information (e.g., bridges, rivers, buildings and gasoline stations) at points between intersections. Thus, nodes indicate point data representing map coordinates, and some of the nodes are intersections. Arcs indicate line data and represent portions of the roads. By adopting such an arrangement, a landmark serving as useful guidance information can be provided as data between roads (i.e., between two arcs) connecting node numbers on either side of a node number n of a particular intersection.
Thus, in accordance with the invention, obtaining a map as intersections and information relating to a geographical point located between intersections and possessing useful information makes it possible to input positions simply and accurately. More specifically, positions are precise since they are selected from pre-stored information relating to geographical locations. Vehicle direction can also be inputted in accurate fashion where parking lot exits and intersections are concerned.
In displaying a list of place names, display is made by class to facilitate retrieval. Even if a display of a list of place names is limited to place names which appear with comparative frequency in order to reduce the place names displayed, it is possible to input all geographical point information by code number. Furthermore, it is possible to play a voice track or display a message when operating the apparatus and coordinate selectable items on a display screen by using the same color to identify them, thereby eliminating the chance of error.
Moreover, when the invention is applied to a navigation system in which the coordinates of a plurality of geographical points (e.g., intersections, characterizing structures, etc.) are set and a desired destination is entered, whereupon a course is sought to the desired destination at each geographical point and outputted as guidance information, the advantage of simplified position input is accompanied by the ability to obtain guidance information from any geographical point whatsoever after a desired destination is designated. Furthermore, after the present position is specified and entered, merely inputting a trigger signal makes it possible to readily set the next geographical point as a guidance point in accordance with the guidance information for travel to the desired destination.
Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram illustrating the construction of an embodiment of a navigation apparatus according to the present invention;
FIG. 2 is a diagram illustrating one example of a route sequence;
FIG. 3 is a flowchart of navigation processing according to the invention;
FIGS. 4(a) and 4(b) are useful in describing course exploration processing;
FIGS. 5(a) through 9(b) are views useful in describing the structure of data according to the invention;
FIG. 10 is a flowchart illustrating an example of a desired destination input method;
FIGS. 11(a) to 11(g) illustrate examples of screens displayed in accordance with the method of FIG. 10;
FIGS. 12 and 13 are flowcharts illustrating an example of a present position input method; and
FIGS. 14(a) to 14(e) illustrate examples of screens displayed in accordance with the method of FIGS. 12 and 13.
DESCRIPTION OF THE PREFERRED EMBODIMENT
An embodiment of the invention will now be described in detail with reference to the drawings.
As shown in FIG. 1, a navigation apparatus according to the invention comprises input means 1, a CPU 2, a display unit 3 such as a CRT or liquid crystal display panel, and a memory unit 4. The input unit 1 includes a keyboard 5 comprising a ten-key pad and function keys for inputting code numbers of predetermined geographical locations, such as a desired destination and present position (guidance location). A touch panel 6, light pen 7, mouse 8 or voice input means 9 may be employed instead of the keyboard 5. The memory unit 4 is a memory such as a CD-ROM in which network data indicative of geographical points, namely desired destination and present position, and other information are stored in advance. As will be described below, map data 10, a list 11 of intersections, a list 12 of desired destinations, road data 13 and a list 14 of regions are stored.
When a desired destination is designated by an input from the input means 1, the CPU 2 sets information for travel to the desired destination, by a method such as course exploration, in accordance with each geographical point stored in the memory unit 4, and stores this information in a memory such as a RAM. When present position information is entered by the input means 1, the display unit 3 outputs guidance information for this point. If only intersections serve as geographical points, the outputted guidance information is that for travel to the next intersection, such as an indication of a left or right turn, at the intersection serving as the guidance point. In a case where there is a second intersection encountered immediately after turning at the aforementioned next intersection, it is of course possible for the outputted guidance information to include the direction of the first turn along with information designating the proper lane to take after the turn, as well as the direction of the second turn and the associated guidance information. For example, the display unit can output guidance information relating to a course leading to a desired destination in accordance with the path sequence a, b, c, . . . shown in FIG. 2.
Processing associated with the navigation apparatus of the invention will now be described with reference to the flowchart of FIG. 3.
When the driver enters the code of a desired destination at a step (1) of the flowchart, a course exploration mode is established in which information for travel to the desired destination is set for all geographical points with the exception of the entered desired destination (step 2). When course exploration ends, a present position input mode is established, in which the driver inputs the code of his present position (step 3). When this is done, the proper direction of forward travel from this position is outputted (step 4). Next, when the driver inputs an intersection verification trigger (i.e., when a start input is made) at step (5), information for travel to a destination which is the next intersection is outputted (step 6). This is followed by step (7), at which monitoring is performed to see whether the intersection verification trigger or a signal from a present-position input button has been entered. If the present position input button has been pressed, the program returns to the processing of step (3). In other words, in accordance with this system, a trigger is inputted each time an intersection is verified providing that the vehicle is travelling as per instructions. If the vehicle strays from the instructed course and the driver notices this only after the vehicle has travelled to another intersection, the present position input button is pressed. Accordingly, whenever a trigger is inputted, guidance information relating to an intersection on a route leading to the desired destination is outputted in sequential fashion. When the present position input button is pressed, the present position input mode is established.
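As a rough illustration of the loop just described, the following Python sketch models the trigger-driven cycle of steps (3) through (7); the names and the sample guidance table are invented for illustration, since the patent itself specifies no code.

    # Hypothetical guidance table produced by course exploration (step 2):
    # for each intersection, the next intersection on the way to the destination.
    NEXT_TOWARD_DESTINATION = {"A": "B", "B": "C", "C": "DESTINATION"}

    def run_navigation(start, events):
        """events: ("TRIGGER",) for an intersection verification trigger,
        ("POSITION", code) for a press of the present position input button."""
        position = start
        guidance = []
        for event in events:
            if event[0] == "POSITION":                       # re-enter step (3)
                position = event[1]
            else:                                            # steps (5)-(6)
                position = NEXT_TOWARD_DESTINATION[position]
            guidance.append(position)                        # guidance toward destination
        return guidance

    print(run_navigation("A", [("TRIGGER",), ("POSITION", "B"), ("TRIGGER",)]))
    # ['B', 'B', 'C']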
The course exploration processing of step (2) will now be described with reference to FIGS. 4(a) and 4(b). When the course exploration mode is established, as shown in FIG. 4(a), first the desired destination is set in a work area at step (11), after which forward directions from intersections near the destination are set at step (12). As shown in FIG. 4(b), the set forward directions include forward directions d1 at intersections before the destination, and forward directions d2 at intersections before the first-mentioned intersections. It is permissible to execute this course exploration after the processing of step (3) in FIG. 3, in which case course exploration would be performed whenever present position is inputted. Furthermore, since guidance information is outputted in response to the trigger input in accordance with the route set as a result of course exploration, the pertinent intersections are limited in number. Accordingly, it will suffice to provide guidance information solely for these intersections, thereby minimizing the information required.
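The forward directions d1, d2, ... spreading outward from the destination amount to recording, at every intersection, which neighbouring intersection to head toward next. One way to picture this is the breadth-first sketch below, whose network data and function names are invented for illustration.

    from collections import deque

    def explore_courses(neighbours, destination):
        """Record, for every reachable intersection, the next intersection
        to take toward the destination (a minimal sketch of step (2))."""
        forward = {destination: None}
        queue = deque([destination])
        while queue:
            node = queue.popleft()
            for other in neighbours.get(node, []):
                if other not in forward:          # nearest-to-destination wins
                    forward[other] = node         # forward direction d1, d2, ...
                    queue.append(other)
        return forward

    # Hypothetical network: intersection -> directly connected intersections.
    network = {"DEST": ["P", "Q"], "P": ["DEST", "R"], "Q": ["DEST"], "R": ["P"]}
    print(explore_courses(network, "DEST"))
    # {'DEST': None, 'P': 'DEST', 'Q': 'DEST', 'R': 'P'}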
The characterizing feature of the present invention, specifically a system for inputting the desired destination or present position, will now be described with reference to FIGS. 5 through 9.
FIGS. 5 through 9 illustrate the structure of data in accordance with the invention.
FIG. 5 is a diagram useful in describing the fundamental approach adopted in forming map data. As shown in FIG. 5(a), a map is not construed merely as intersections but also includes nodes, which are points having useful guidance information (e.g., bridges, rivers, buildings, gasoline stations, etc.) at geographical points between intersections. Thus, nodes indicate point data representing map coordinates, and some of the nodes are intersections. Arcs indicate line data and represent portions of the roads. By adopting such an arrangement, a landmark serving as useful guidance information can be provided as data between roads (i.e., between two arcs) connecting node numbers on either side of a node number n of an intersection depicted in FIG. 5(b).
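A minimal sketch of how such map data might be held in memory is given below; the field names are invented, and only the relationships stated above are assumed (nodes carry coordinates and an attribute, arcs join node numbers, and a landmark is attached between the two arcs on either side of an intersection node).

    from dataclasses import dataclass

    @dataclass
    class Node:                       # point data: one map coordinate
        number: int
        east_longitude: float
        north_latitude: float
        attribute: str = ""           # e.g. "bridge", "river", "gasoline station"
        is_intersection: bool = False

    @dataclass
    class Arc:                        # line data: a portion of a road
        start_node: int
        end_node: int

    @dataclass
    class Landmark:                   # useful guidance information at a corner
        name: str
        before_arc: Arc               # arc entering the intersection node n
        after_arc: Arc                # arc leaving the intersection node n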
FIG. 6 illustrates node series data. Stored for each node number are the east longitude and north latitude, namely the coordinates of the pertinent geographical point, as well as the attribute which distinguishes the effective guidance information (e.g., bridges, rivers, buildings, gasoline stations).
FIG. 7 illustrates an example of an intersection list, in which there are stored the code numbers of the intersections, the intersection names, the intersection numbers (numbers assigned only to those of the nodes that are intersections), the node numbers of two connecting nodes, as described above with reference to FIG. 5(b), the names of landmarks and attributes.
FIG. 8 illustrates an example of a desired destination list, in which there are stored code numbers, the names of desired destinations, parking lot numbers, the numbers of two connecting intersections connecting a desired destination, the directions of parking lots (whether a parking lot is on the left or right side of a road or straight ahead), the numbers of photographs of connecting intersections, the numbers of photographs of parking lot exits, block data for each region, and coordinates (east longitude, north latitude). The arrangement is such that the attribute of each desired destination is distinguishable by class. For example, the following numbers can be assigned to the most significant bits of code numbers to indicate class: 0 (sightseeing), 1 (public facility), 2 (lodgings), 3 (dining), 4 (place of business), 5 (gasoline station), 6 (intersection), 7 (parking lot), 8 (souvenirs), and other attribute data can be provided if desired. These desired destination data indicate parking areas near the desired destinations. If a desired destination is a parking lot, the driver is informed of the connecting intersection numbers, the direction of the parking lot (whether it is on the left or right side of a road or straight ahead), the photograph numbers of the connecting intersections and the photograph numbers of the parking lot exit. Thus, the driver is guided in positive fashion until the vehicle arrives at its final destination.
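Because the class rides in the leading digit of the code number, it can be recovered arithmetically; the sketch below assumes, purely for illustration, five-digit decimal codes.

    CLASSES = {0: "sightseeing", 1: "public facility", 2: "lodgings", 3: "dining",
               4: "place of business", 5: "gasoline station", 6: "intersection",
               7: "parking lot", 8: "souvenirs"}

    def class_of(code, digits=5):
        """Return the class named by the most significant digit of a code number
        (the five-digit length is an assumption of this sketch)."""
        return CLASSES[code // 10 ** (digits - 1)]

    print(class_of(70042))    # 'parking lot'
    print(class_of(60123))    # 'intersection' - later rejected as a destination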
FIG. 9(a) illustrates an example of road data. As shown in FIG. 9(b), each road is assigned a road number(s) along with the direction(s) of traffic flow. The stored road data include, for each road number, the node numbers of starting and end points of the road, the number of a road having the same starting point, the number of a road having the same end point, road width, information relating to prohibitions, information relating to guidance not required (as when the driver need only continue travelling straight ahead), photograph numbers, the numbers of nodes, the leading addresses of node series data, length, etc.
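One plausible reading of the "number of a road having the same starting point" field is that it chains together all roads leaving a given intersection, which is what the retrieval described later with reference to FIG. 13 exploits; the record layout and helper below are invented to illustrate that reading.

    from dataclasses import dataclass

    @dataclass
    class Road:
        number: int
        start_node: int
        end_node: int
        same_start_road: int          # next road sharing this starting point, 0 if none
        node_series_address: int      # leading address of the node series data
        length: float = 0.0

    def roads_from(roads_by_number, first_road_number):
        """Collect every road leaving one intersection by walking the chain."""
        result, number = [], first_road_number
        while number:
            road = roads_by_number[number]
            result.append(road)
            number = road.same_start_road
        return result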
Destination input will now be described with reference to FIGS. 10 and 11.
FIG. 10 is a flowchart of the associated processing. A step 101 calls for the screen shown in FIG. 11(a) to be displayed as a desired destination input. This screen is for indicating the different classes which can be displayed in selecting the destination. Items which appear frequently, such as "SIGHTSEEING", "LODGINGS", "DINING", "SOUVENIRS", "CODE NO. INPUT", "RETURN" are displayed in red as selectable items, and a desired class is selected by touch-panel input at step 102. Next, it is determined at step 103 whether "RETURN" on the screen of FIG. 11(a) has been inputted. If the answer is NO, it is determined at step 106 whether "CODE NO." has been inputted. If the answer received here is YES, then the program proceeds to step 112.
If a NO answer is received at step 106, the program proceeds to step 107, at which a list of parking lots (destinations) for the selected item is read in from the CD-ROM, whereupon the screen shown in FIG. 11(b) is displayed at step 108. At the same time, a voice track "SELECT YOUR DESIRED DESTINATION" is played. Here also items are displayed in the order of their popularity. By touching a "PREVIOUS PAGE" or "NEXT PAGE" key, parking lots can be brought to the screen and a desired parking lot can be selected. All of the input display sections are displayed in the color red. The last item in the display is the "CODE NO. INPUT" item. If "PREVIOUS PAGE" is entered at the first page, the program returns to step 101.
When a desired parking lot is inputted by the touch panel (step 109), a confirmation screen shown in FIG. 11(c) is displayed at step 110. Here the selected item is backlighted in, say, the color blue, while the other items appear in dark blue, so that the driver may easily confirm the selection made. If "CANCEL" is pressed, the program returns to step 108. If OK is pressed, it is determined at step 111 whether the name of a parking lot has been inputted or a code number. If the name of a parking lot is the desired destination, the program proceeds to step 119, where data corresponding to the name of the parking lot are read from the CD-ROM and set in the memory area of the CPU.
When a change is made in the code number input at step 111, or when code number input is selected at step 106, a code number input screen shown in FIG. 11(d) is displayed at step 112, after which a desired parking lot code number is inputted from the touch panel at step 113. It is then determined at a step 114 whether the code number designation is erroneous. If it is, step 115 calls for display of a message reading "CODE NO. DESIGNATION IS INCORRECT" and the program returns to step 113. If the code number designation is correct, then it is determined at step 116 whether the code number is an intersection code. If it is not an intersection code, the program proceeds to step 118, at which the desired destination is displayed automatically, as shown in FIG. 11(e). If the "OK" key is pressed, the program proceeds to step 119, at which data corresponding to parking lot name are read out of the CD-ROM and set in the memory area of the CPU. The program returns to step 113 if "CANCEL" is pressed.
If the code number designated at step 116 is indicative of an intersection, a message reading "INTERSECTION CODE NO. CANNOT BE ENTERED", which is shown in FIG. 11(f), is displayed at step 117 and the program returns to step 113. When "RETURN" is inputted at step 101 in execution of the above routine, the screen of FIG. 11(g) is displayed through steps 103, 104. If the driver presses "OK", the initial departure point data are copied in the desired destination storage area at step 105.
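The checks of steps 114 through 117 thus amount to two tests on the entered code: is it a known code at all, and does it name an intersection rather than a destination? A compressed sketch with invented sample data follows.

    DESTINATIONS = {70042: "Station-front parking lot"}      # invented sample data
    INTERSECTIONS = {60123: "Third Avenue crossing"}         # invented sample data

    def validate_code(code):
        """Rough mirror of steps 114-117 of FIG. 10."""
        if code not in DESTINATIONS and code not in INTERSECTIONS:
            return "CODE NO. DESIGNATION IS INCORRECT"        # step 115
        if code in INTERSECTIONS:
            return "INTERSECTION CODE NO. CANNOT BE ENTERED"  # step 117
        return DESTINATIONS[code]                             # proceed to steps 118-119

    print(validate_code(99999))   # CODE NO. DESIGNATION IS INCORRECT
    print(validate_code(60123))   # INTERSECTION CODE NO. CANNOT BE ENTERED
    print(validate_code(70042))   # Station-front parking lot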
Input of present position at an intersection will now be described with reference to FIGS. 12 through 14.
FIG. 12 is a flowchart of processing for setting the position of an automotive vehicle. Step 131 of the flowchart calls for display of a message, shown in FIG. 14(a), requesting entry of the name of an intersection. In response to the message, the driver continues driving straight ahead until an intersection having a name is passed, whereupon the driver immediately stops the vehicle and enters the intersection number (step 132) while referring to an instruction manual. When this is done, a code number input screen shown in FIG. 14(b) is displayed at step 133 and the code number is inputted by the touch panel at step 134, in response to which the name of the intersection shown in FIG. 14(c) is displayed at step 135. If the entered code number is incorrect, a message to this effect will be displayed at this time.
Next, it is determined at step 136 whether the name of the intersection is "OK" or is to be cancelled. If "CANCEL" is pressed, the program returns to step 133. If "OK" is pressed, processing for displaying a vehicle position input screen is executed at step 137 and the screen shown in FIG. 14(d) is displayed. Here the node data inputted by code number is read from the map data and the shape of the intersection is displayed based on the data indicative of the arcs connected to this intersection. In addition, the numbers of the roads leading to the intersection are displayed on the roads so that the road number can be entered from the ten-key pad. As shown in FIG. 14(d), only those keys corresponding to the intersection road numbers are displayed in, for example, the color red. Further, the location of a landmark is displayed at the position of a line segment bisecting the angle formed by two arcs, and the name of the landmark is displayed as well.
When the driver enters the number of the road on which the vehicle is presently located while referring to the location of the landmark, a screen such as that shown in FIG. 14(e) is displayed at step 138. This screen calls for the driver to confirm the number of the road on which the vehicle is presently located. Step 139 calls for the driver to press "OK" or "CANCEL". The program returns to step 137 if "CANCEL" is pressed. If "OK" is pressed, a location that is a predetermined distance (e.g., 70 m) from the starting point node on the designated arc is set as the vehicle position at step 140.
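Placing the vehicle 70 m from the starting-point node along the designated arc is essentially a linear interpolation along that arc; the sketch below assumes planar coordinates in metres, ignoring the curvature that true longitude and latitude values would introduce.

    import math

    def point_along_arc(start, end, distance=70.0):
        """Return the point `distance` metres from `start` toward `end`
        (a minimal sketch of step 140; planar coordinates assumed)."""
        dx, dy = end[0] - start[0], end[1] - start[1]
        length = math.hypot(dx, dy)
        t = min(distance / length, 1.0)      # clamp if the arc is shorter than 70 m
        return (start[0] + t * dx, start[1] + t * dy)

    print(point_along_arc((0.0, 0.0), (200.0, 0.0)))   # (70.0, 0.0)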
FIG. 13 is a flowchart of processing for displaying the vehicle position input screen of step 137.
With regard to the shape of the intersection on the screen of FIG. 14(e), first an intersection number Co is specified by inputting the name of the intersection using a code number. On the basis of this intersection number Co, roads leading to the intersection designated as a starting point, namely roads which include the designated intersection, are selected from the arc and road data (FIG. 9). Then, from the leading address of the node series data, the node series data of FIG. 6 are transformed from map coordinates to screen coordinates and displayed. This will now be described in greater detail.
With reference to FIG. 9, assume that intersection number (2) is designated by designating its intersection code number. In such case, roads which include intersection number (2) are retrieved from the starting point data contained in the road data (the loop of steps 202, 203, 204, 215). When this is done, road number 2 whose starting point is 2 is found, and the leading address 200 of the node series data representative of this road can be read out of memory (step 205). Though a screen is displayed based on the east-longitude and north-latitude data contained in the node series data, it is required, with regard to the initial data, to determine transformation equations for dealing with the transformation from the east-longitude and north-latitude data to the screen coordinates (j=0, f=0, step 208). The node series data represent a sequence of nodes which include intersections located between intersections, and the east longitude and north latitude of intersection number (2) are the initial data of the node sequence from address 200. The east longitude and north latitude are stored as EXo, EYo, and from an enlargement coefficient a we have
X = EXo × a + bx,  Y = EYo × a + by
In these transformation equations to the screen coordinates (X, Y), bx and by are decided in such a manner that the point (EXo, EYo) is placed at the center of the screen, where X = 0 and Y = 0 hold. At this time a flag f, indicating that the transformation coefficients have already been decided, is set to 1 (step 210). As a result, step 210 is not traversed in subsequent processing. Thus the coordinates of intersection (2) are transformed to the center of the screen, where the screen coordinates are (0, 0), and the starting point Xo, Yo for painting a straight line on the screen is made 0 at step 211.
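In code, the choice of bx and by reduces to solving the two equations above for X = 0, Y = 0 at the designated intersection. The Python fragment below is only a sketch of that step; the function name and the (EX, EY) argument convention are assumptions made for illustration.

    def make_screen_transform(ex0, ey0, a):
        # Decide bx, by so that the designated intersection (EXo, EYo) maps to the
        # screen centre (0, 0), i.e. EXo*a + bx = 0 and EYo*a + by = 0 (cf. step 210).
        bx = -ex0 * a
        by = -ey0 * a
        def to_screen(ex, ey):
            return (ex * a + bx, ey * a + by)
        return to_screen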
Subsequent processing involves sequentially reading out the node series data (step 207), transforming them into screen coordinates, storing the results as X1, Y1, and painting straight lines on the screen from the screen coordinates of the immediately preceding node (steps 212, 214). By repeating this processing, a road is painted from intersection number (2) to intersection number (1). The program escapes from this loop, and the painting of one road is taken to be finished (step 213), when the X1, Y1 calculated at step 212 fall outside a predetermined screen coordinate area. Then a road having (2) as its starting point number is again retrieved from the road data of FIG. 9 and road number (3) is found (steps 202, 203). The node series data are then read out of memory and painting of a road from intersection number (2) is carried out again. However, since the initial data at address 300 are the same as the EXo, EYo read out previously, X and Y are made 0 (steps 217, 211). By repeating the foregoing processing, road numbers 2 and 3 are painted from intersection number (2) to each of the other intersections within the limits of the display screen. This ends the processing for displaying the shape of the intersection.
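Taken together, the retrieval and painting loop can be pictured as follows. This is a hedged sketch only: the road records (a "start" intersection number and a node-series address), the in_bounds test and the draw_line callback are invented here for illustration and do not appear in the patent; the to_screen mapping could be the one built by make_screen_transform in the previous sketch.

    def paint_roads_from(intersection_no, road_data, node_series_data,
                         to_screen, in_bounds, draw_line):
        # For every road whose starting point is the designated intersection,
        # transform successive nodes to screen coordinates and paint segments
        # until a node falls outside the screen area (cf. steps 202-217).
        for road in road_data:
            if road["start"] != intersection_no:
                continue
            nodes = node_series_data[road["node_address"]]
            x0, y0 = to_screen(*nodes[0])          # the designated intersection maps to (0, 0)
            for ex, ey in nodes[1:]:
                x1, y1 = to_screen(ex, ey)
                if not in_bounds(x1, y1):          # off the screen area: this road is finished
                    break
                draw_line((x0, y0), (x1, y1))
                x0, y0 = x1, y1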
When the display of the shape of the intersection ends, processing for the display of the landmark position is performed starting from step 219 on the basis of the intersection list data, road data and node series data. In the intersection list data, two intersection numbers Co1, Co2 are indicated as connecting intersections. Assuming that these are intersections (3) and (4) in FIG. 9, the landmark is displayed within the angle formed by intersections (2), (3) and (4). The roads which include intersection (3) as a starting point are retrieved from the road data, whereby road number 4 is found. East-longitude and north-latitude data contained in the node series data are read from the node series data address of this road, these data are stored as EX1, EY1, and a flag f1 indicating that the coordinates of the first connecting intersection have been set is made 1 (steps 223-225). Similarly, east-longitude and north-latitude data for intersection number (4) are stored as EX2, EY2. By means of the foregoing processing, coordinate data for the intersections (2), (3) and (4) are stored as (EXo,EYo), (EX1,EY1) and (EX2,EY2), respectively. A line segment which bisects the angle formed by these intersections is obtained from these coordinates (step 232), the coordinates of a point a predetermined distance from (EXo,EYo) are made (Xo,Yo) (step 233), and these coordinates are transformed into screen coordinates by the transformation equations obtained at step 210. As a result, the landmark is displayed.
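The landmark placement of steps 232-233 is an angle-bisector construction. The sketch below is again only illustrative: it takes the three intersection coordinates and a distance and returns the landmark point in map coordinates, assuming the connecting intersections are distinct from (EXo, EYo); the fallback for degenerate geometry (the two arcs pointing in exactly opposite directions) is an assumption the patent does not specify.

    import math

    def landmark_position(c0, c1, c2, distance):
        # c0 = (EXo, EYo), c1 = (EX1, EY1), c2 = (EX2, EY2); the result is a point on
        # the line bisecting the angle at c0, a fixed distance from c0 (cf. steps 232-233).
        def unit(p, q):
            dx, dy = q[0] - p[0], q[1] - p[1]
            n = math.hypot(dx, dy)
            return (dx / n, dy / n)
        u1, u2 = unit(c0, c1), unit(c0, c2)
        bx, by = u1[0] + u2[0], u1[1] + u2[1]      # direction of the angle bisector
        n = math.hypot(bx, by)
        if n == 0.0:                               # arcs are opposite; bisector undefined
            bx, by, n = -u1[1], u1[0], 1.0         # fall back to a perpendicular direction
        return (c0[0] + distance * bx / n, c0[1] + distance * by / n)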
The present invention is not limited to the foregoing embodiment but can be modified in various ways.
For example, though the above embodiment describes destinations and intersections as being inputted by entering code numbers, data indicative of Japanese kana characters or Roman letters can be provided as the destination and intersection data, and these can then be inputted by character retrieval.
Further, it is permissible to adopt an arrangement in which the conventional set-up provided with a distance sensor or steering sensor is combined with the navigation apparatus of the present invention. The resulting system can be adapted in such a manner that, rather than the next geographical point being identified only when a switch is operated by the driver, the navigation apparatus identifies predetermined geographical points automatically by other means and changes over the guidance information delivered to the driver each time.
With a combination of the present invention and the conventional system, it can be so arranged that a region having a simple road network, such as only a single road, is handled by the conventional system having the sensors, while a region having a complicated road network is dealt with by the system of the present invention. It can also be so arranged that the navigation apparatus of the present invention is used as a back-up if the conventional navigation apparatus fails.
It is also possible to provide information relating to the distances between the geographical points at which guidance is given, measure the distance travelled with a distance sensor, and then urge the driver to input the next geographical point (i.e., to make a trigger input) by means of a voice track or visual display.
Further, in a case where the arrangement allows the driver to set a desired type of course, such as a route along back roads or a route along a principal road, each geographical point can be provided with information indicating whether the point lies on a route of the desired category (i.e., back road or principal road). By designating the desired category of road when the guidance information is set, course exploration can be carried out solely in terms of the geographical points along routes of the desired type.
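A minimal sketch of such filtering, assuming each geographical point record carries a set of category tags (the field name and category strings below are invented for illustration):

    def points_for_category(points, desired_category):
        # Keep only the geographical points tagged with the desired road category,
        # e.g. "back_road" or "principal_road", so that course exploration visits
        # only routes of the requested type.
        return [p for p in points if desired_category in p.get("categories", ())]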
As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the appended claims.

Claims (5)

What is claimed is:
1. A navigation system for a vehicle having a position input system for assisting vehicle navigation to a desired destination, which comprises:
input means for inputting said desired destination and for inputting a present intersection position in terms of characters or code numbers identifying the present intersection;
memory means for storing information about said desired destination and map information; said map information including an intersection list and a road data list; said intersection list including information of landmarks located in the vicinity of respective intersections; said road data list including starting point intersection information and ending point intersection information;
display means for displaying guidance information; and
control means for controlling said display means to display the present intersection, roads in said road data list having starting point information corresponding to said intersection and a landmark in the vicinity of said intersection, wherein said control means includes means for determining a line bisecting an angle between two roads displayed on said display means and intersecting at said intersection, and means for positioning the display of the landmark on the bisecting line a selected distance from said intersection;
said input means further including means for inputting a present road position by identifying one of said displayed roads relative to said landmark as displayed on said display means.
2. The system according to claim 1, wherein said desired destination is a parking lot.
3. The system according to claim 1, wherein said desired destination is inputted according to class or by code number.
4. The system according to claim 1, wherein the desired destination and the map information are stored in a CD-ROM.
5. A navigation system for a vehicle having a position input system for assisting vehicle navigation to a desired destination, which comprises:
input means for inputting a present position in terms of characters or code numbers and for inputting said desired destination by first selecting a class of destinations and then selecting a destination from the selected class;
memory means for storing information about said desired destination and map information; said map information including intersection and road data, and a landmark located between intersections;
display means for displaying guidance information; and
control means for controlling said display means to firstly display a plurality of classes of destinations and secondly to display destinations corresponding to said selected class of destinations, wherein said displayed destinations are in response to the selection of said class of destinations causing said destinations to be displayed on said display means, and said desired destination is selected by identifying one of said destinations displayed on said display means;
and wherein said desired destinations information includes a list of desired destinations each having classification data corresponding to one of said plurality of classes of destinations, and said displayed destinations are selected by determining destinations in said list of desired destinations which have classification data corresponding to said selected class.
US07/618,021 1987-12-28 1990-11-26 Position input system for vehicular navigation apparatus Expired - Fee Related US5115399A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP62-333052 1987-12-28
JP62333052A JPH01173820A (en) 1987-12-28 1987-12-28 Position input system for navigation device for vehicle

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US07290202 Continuation 1988-12-27

Publications (1)

Publication Number Publication Date
US5115399A true US5115399A (en) 1992-05-19

Family

ID=18261728

Family Applications (1)

Application Number Title Priority Date Filing Date
US07/618,021 Expired - Fee Related US5115399A (en) 1987-12-28 1990-11-26 Position input system for vehicular navigation apparatus

Country Status (3)

Country Link
US (1) US5115399A (en)
EP (1) EP0323230A3 (en)
JP (1) JPH01173820A (en)

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5262775A (en) * 1992-04-07 1993-11-16 Zexel Corporation Navigation system with off-route detection and route recalculation
US5274560A (en) * 1990-12-03 1993-12-28 Audio Navigation Systems, Inc. Sensor free vehicle navigation system utilizing a voice input/output interface for routing a driver from his source point to his destination point
US5340061A (en) * 1991-05-27 1994-08-23 Sextant Avionique Method and device for revising the lateral flight plan of an aircraft
US5359527A (en) * 1991-11-06 1994-10-25 Mitsubishi Denki Kabushiki Kaisha Navigation system for vehicle
US5369588A (en) * 1991-08-09 1994-11-29 Mitsubishi Denki Kabushiki Kaisha Navigation system for motor vehicles
US5377113A (en) * 1992-05-27 1994-12-27 Zexel Corporation Navigation system for use in vehicle
US5406389A (en) * 1991-08-22 1995-04-11 Riso Kagaku Corporation Method and device for image makeup
US5420795A (en) * 1992-12-08 1995-05-30 Greene; Leonard M. Navigational system for determining and displaying the position of a vessel on a navigational chart
US5454062A (en) * 1991-03-27 1995-09-26 Audio Navigation Systems, Inc. Method for recognizing spoken words
US5486822A (en) * 1990-11-09 1996-01-23 Sumitomo Electric Industries, Ltd. Optimum route determination
US5506779A (en) * 1993-05-13 1996-04-09 Matsushita Electric Industrial Co., Ltd. Route searching apparatus
US5519809A (en) * 1992-10-27 1996-05-21 Technology International Incorporated System and method for displaying geographical information
US5525883A (en) * 1994-07-08 1996-06-11 Sara Avitzour Mobile robot location determination employing error-correcting distributed landmarks
US5528501A (en) * 1994-03-28 1996-06-18 At&T Corp. System and method for supplying travel directions
US5543788A (en) * 1991-07-12 1996-08-06 Fujitsu Limited Map management system in geographic information management system
US5544061A (en) * 1993-05-28 1996-08-06 Aisin Aw Co., Ltd. Navigation system with destination set by longitude and latitude coordinate points
US5592389A (en) * 1990-12-03 1997-01-07 Ans, Llp Navigation system utilizing audio CD player for data storage
US5608635A (en) * 1992-04-14 1997-03-04 Zexel Corporation Navigation system for a vehicle with route recalculation between multiple locations
US5646856A (en) * 1989-06-08 1997-07-08 Kaesser; Juergen Vehicle navigation system
US5748840A (en) * 1990-12-03 1998-05-05 Audio Navigation Systems, Inc. Methods and apparatus for improving the reliability of recognizing words in a large database when the words are spelled or spoken
US5754430A (en) * 1994-03-29 1998-05-19 Honda Giken Kogyo Kabushiki Kaisha Car navigation system
US5798733A (en) * 1997-01-21 1998-08-25 Northrop Grumman Corporation Interactive position guidance apparatus and method for guiding a user to reach a predetermined target position
US5819200A (en) * 1996-02-14 1998-10-06 Zexel Corporation Method and apparatus for selecting a destination in a vehicle navigation system
US5884219A (en) * 1996-10-10 1999-03-16 Ames Maps L.L.C. Moving map navigation system
US5910782A (en) * 1997-02-25 1999-06-08 Motorola, Inc. On-board vehicle parking space finder service
US5925091A (en) * 1995-06-12 1999-07-20 Alpine Electronics, Inc. Method and apparatus for drawing a map for a navigation system
US5941930A (en) * 1994-09-22 1999-08-24 Aisin Aw Co., Ltd. Navigation system
US5945985A (en) * 1992-10-27 1999-08-31 Technology International, Inc. Information system for interactive access to geographic information
US5987375A (en) * 1996-02-14 1999-11-16 Visteon Technologies, Llc Method and apparatus for selecting a destination in a vehicle navigation system
US6088649A (en) * 1998-08-05 2000-07-11 Visteon Technologies, Llc Methods and apparatus for selecting a destination in a vehicle navigation system
US6282489B1 (en) * 1993-05-28 2001-08-28 Mapquest.Com, Inc. Methods and apparatus for displaying a travel route and generating a list of places of interest located near the travel route
DE10018195A1 (en) * 2000-04-12 2001-10-25 Bayerische Motoren Werke Ag Character-by-character input device for vehicle navigation system, has display unit that shows selected symbols such that display sequence of selected symbols is based on frequency distribution
US6314368B1 (en) * 1982-11-08 2001-11-06 Hailemichael Gurmu Vehicle guidance system and method therefor
US6314370B1 (en) 1996-10-10 2001-11-06 Ames Maps, Llc Map-based navigation system with overlays
US20020059207A1 (en) * 2000-07-27 2002-05-16 Nec Corporation Information search/presentation system
US6408307B1 (en) 1995-01-11 2002-06-18 Civix-Ddi, Llc System and methods for remotely accessing a selected group of items of interest from a database
US20040036649A1 (en) * 1993-05-18 2004-02-26 Taylor William Michael Frederick GPS explorer
US6735516B1 (en) 2000-09-06 2004-05-11 Horizon Navigation, Inc. Methods and apparatus for telephoning a destination in vehicle navigation
US20040204844A1 (en) * 2003-01-24 2004-10-14 Yonglong Xu Navigational device
US20040243300A1 (en) * 2003-05-26 2004-12-02 Nissan Motor Co., Ltd. Information providing method for vehicle and information providing apparatus for vehicle
US20060015246A1 (en) * 2004-07-15 2006-01-19 Alvin Hui Method and apparatus for specifying destination using previous destinations stored in navigation system
US20060037990A1 (en) * 2002-05-03 2006-02-23 Geise Doran J System to navigate within images spatially referenced to a computed space
US20060171308A1 (en) * 2005-01-31 2006-08-03 Jung Edward K Method and system for interactive mapping to provide goal-oriented instructions
US20060171325A1 (en) * 2005-02-03 2006-08-03 Jung Edward K Interactive queued mapping method and system
US20060181546A1 (en) * 2005-02-15 2006-08-17 Jung Edward K Interactive key frame image mapping system and method
US20060198550A1 (en) * 2005-02-25 2006-09-07 Jung Edward K Image mapping to provide visual geographic path
US20060247853A1 (en) * 2005-04-30 2006-11-02 Jung Edward K Map display system and method
US20060284767A1 (en) * 1995-11-14 2006-12-21 Taylor William M F GPS explorer
US20070186159A1 (en) * 2006-02-08 2007-08-09 Denso International America, Inc. Universal text input method for different languages
US20070233658A1 (en) * 2006-03-31 2007-10-04 Aol Llc Identifying a result responsive to location data for multiple users
US20080086455A1 (en) * 2006-03-31 2008-04-10 Aol Llc Communicating appointment and/or mapping information among a calendar application and a navigation application
US20080140313A1 (en) * 2005-03-22 2008-06-12 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Map-based guide system and method
US20080147312A1 (en) * 2005-03-22 2008-06-19 Searete Llc Map-based guide system and method
US20080167937A1 (en) * 2006-12-29 2008-07-10 Aol Llc Meeting notification and modification service
US20080167938A1 (en) * 2006-12-29 2008-07-10 Aol Llc Reserving a time block in a calendar application to account for a travel time between geographic locations of appointments
US20080215435A1 (en) * 2005-03-22 2008-09-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Map-based guide system and method
US20090076720A1 (en) * 2005-02-25 2009-03-19 Jung Edward K Y Image mapping to provide visual geographic path
US7743056B2 (en) 2006-03-31 2010-06-22 Aol Inc. Identifying a result responsive to a current location of a client device
US20100235078A1 (en) * 2009-03-12 2010-09-16 Microsoft Corporation Driving directions with maps and videos
US20110098918A1 (en) * 2009-10-28 2011-04-28 Google Inc. Navigation Images
US8103313B2 (en) 1992-11-09 2012-01-24 Adc Technology Inc. Portable communicator
US9214033B2 (en) 2005-06-01 2015-12-15 Invention Science Fund I, Llc Map display system and method
US9286729B2 (en) 2005-02-25 2016-03-15 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US9702713B2 (en) 2005-01-31 2017-07-11 Searete Llc Map-based guide system and method

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
DE4334701C2 (en) * 1992-10-12 1997-01-16 Maspro Denko Kk Navigation system and navigation method with a route determination process that is able to determine a desired route quickly and completely
JP2598857B2 (en) * 1992-10-12 1997-04-09 マスプロ電工株式会社 Vehicle travel route setting device
JP2603789B2 (en) * 1992-11-16 1997-04-23 マスプロ電工株式会社 Vehicle travel route guidance device
US5398189A (en) * 1992-11-16 1995-03-14 Masprodenkoh Kabushikikaisha Navigation system for motor vehicles
JP2556650B2 (en) * 1993-03-31 1996-11-20 マスプロ電工株式会社 Vehicle travel route setting device
US5537324A (en) * 1993-08-07 1996-07-16 Aisin Aw Co., Ltd. Navigation system
US5784059A (en) 1994-09-16 1998-07-21 Aisin Aw Co., Ltd. Vehicle navigation system with destination selection using hierarchical menu arrangement with selective level skipping
US7268700B1 (en) 1998-01-27 2007-09-11 Hoffberg Steven M Mobile communication device
US8364136B2 (en) 1999-02-01 2013-01-29 Steven M Hoffberg Mobile system, a method of operating mobile system and a non-transitory computer readable medium for a programmable control of a mobile system
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS59132099A (en) * 1983-01-17 1984-07-30 株式会社デンソー Running information system for vehicle

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4812845A (en) * 1983-02-24 1989-03-14 Nippondenso Co., Ltd. Vehicle running guide system
US4774672A (en) * 1985-03-11 1988-09-27 Nissan Motor Company, Limited Navigation system for automotive vehicle including feature of updating vehicle position at selected points along preset course
US4763270A (en) * 1985-03-20 1988-08-09 Nissan Motor Company, Limited Navigation system for a vehicle and method therefor
US4796189A (en) * 1985-03-20 1989-01-03 Nissan Motor Company, Limited Navigation system for automotive vehicle with automatic navigation start and navigation end point search and automatic route selection
US4782447A (en) * 1985-04-03 1988-11-01 Nissan Motor Company, Ltd System and method for navigating a vehicle
US4814989A (en) * 1985-05-30 1989-03-21 Robert Bosch Gmbh Navigation method for vehicles
US4827420A (en) * 1986-06-12 1989-05-02 Mitsubishi Denki Kabushiki Kaisha Navigation apparatus for vehicle
US4792907A (en) * 1986-11-17 1988-12-20 Nippondenso Co., Ltd. Vehicle navigation system

Cited By (114)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6314368B1 (en) * 1982-11-08 2001-11-06 Hailemichael Gurmu Vehicle guidance system and method therefor
US5646856A (en) * 1989-06-08 1997-07-08 Kaesser; Juergen Vehicle navigation system
US5486822A (en) * 1990-11-09 1996-01-23 Sumitomo Electric Industries, Ltd. Optimum route determination
US5274560A (en) * 1990-12-03 1993-12-28 Audio Navigation Systems, Inc. Sensor free vehicle navigation system utilizing a voice input/output interface for routing a driver from his source point to his destination point
US5592389A (en) * 1990-12-03 1997-01-07 Ans, Llp Navigation system utilizing audio CD player for data storage
US6587786B1 (en) 1990-12-03 2003-07-01 Audio Navigation Systems, Inc. Sensor free vehicle navigation system utilizing a voice input/output interface for routing a driver from his source point to his destination point
US5748840A (en) * 1990-12-03 1998-05-05 Audio Navigation Systems, Inc. Methods and apparatus for improving the reliability of recognizing words in a large database when the words are spelled or spoken
US5454062A (en) * 1991-03-27 1995-09-26 Audio Navigation Systems, Inc. Method for recognizing spoken words
US5340061A (en) * 1991-05-27 1994-08-23 Sextant Avionique Method and device for revising the lateral flight plan of an aircraft
US5543788A (en) * 1991-07-12 1996-08-06 Fujitsu Limited Map management system in geographic information management system
US5369588A (en) * 1991-08-09 1994-11-29 Mitsubishi Denki Kabushiki Kaisha Navigation system for motor vehicles
US5406389A (en) * 1991-08-22 1995-04-11 Riso Kagaku Corporation Method and device for image makeup
US5359527A (en) * 1991-11-06 1994-10-25 Mitsubishi Denki Kabushiki Kaisha Navigation system for vehicle
US5262775A (en) * 1992-04-07 1993-11-16 Zexel Corporation Navigation system with off-route detection and route recalculation
US5608635A (en) * 1992-04-14 1997-03-04 Zexel Corporation Navigation system for a vehicle with route recalculation between multiple locations
US5377113A (en) * 1992-05-27 1994-12-27 Zexel Corporation Navigation system for use in vehicle
US5945985A (en) * 1992-10-27 1999-08-31 Technology International, Inc. Information system for interactive access to geographic information
US5519809A (en) * 1992-10-27 1996-05-21 Technology International Incorporated System and method for displaying geographical information
US8103313B2 (en) 1992-11-09 2012-01-24 Adc Technology Inc. Portable communicator
US5420795A (en) * 1992-12-08 1995-05-30 Greene; Leonard M. Navigational system for determining and displaying the position of a vessel on a navigational chart
US5506779A (en) * 1993-05-13 1996-04-09 Matsushita Electric Industrial Co., Ltd. Route searching apparatus
US20080024364A1 (en) * 1993-05-18 2008-01-31 Frederick Taylor William M GPS explorer
US20040036649A1 (en) * 1993-05-18 2004-02-26 Taylor William Michael Frederick GPS explorer
US6282489B1 (en) * 1993-05-28 2001-08-28 Mapquest.Com, Inc. Methods and apparatus for displaying a travel route and generating a list of places of interest located near the travel route
US6498982B2 (en) 1993-05-28 2002-12-24 Mapquest. Com, Inc. Methods and apparatus for displaying a travel route and/or generating a list of places of interest located near the travel route
US5544061A (en) * 1993-05-28 1996-08-06 Aisin Aw Co., Ltd. Navigation system with destination set by longitude and latitude coordinate points
US5528501A (en) * 1994-03-28 1996-06-18 At&T Corp. System and method for supplying travel directions
US5754430A (en) * 1994-03-29 1998-05-19 Honda Giken Kogyo Kabushiki Kaisha Car navigation system
US5525883A (en) * 1994-07-08 1996-06-11 Sara Avitzour Mobile robot location determination employing error-correcting distributed landmarks
US5941930A (en) * 1994-09-22 1999-08-24 Aisin Aw Co., Ltd. Navigation system
US6415291B2 (en) 1995-01-11 2002-07-02 Civix-Ddi, Llc System and methods for remotely accessing a selected group of items of interest from a database
US6408307B1 (en) 1995-01-11 2002-06-18 Civix-Ddi, Llc System and methods for remotely accessing a selected group of items of interest from a database
US5925091A (en) * 1995-06-12 1999-07-20 Alpine Electronics, Inc. Method and apparatus for drawing a map for a navigation system
US20070001875A1 (en) * 1995-11-14 2007-01-04 Taylor William M F GPS explorer
US20060284767A1 (en) * 1995-11-14 2006-12-21 Taylor William M F GPS explorer
US5832408A (en) * 1996-02-14 1998-11-03 Zexel Corporation Method and apparatus for selecting a destination in a vehicle navigation system
US5987375A (en) * 1996-02-14 1999-11-16 Visteon Technologies, Llc Method and apparatus for selecting a destination in a vehicle navigation system
US5819200A (en) * 1996-02-14 1998-10-06 Zexel Corporation Method and apparatus for selecting a destination in a vehicle navigation system
US6314370B1 (en) 1996-10-10 2001-11-06 Ames Maps, Llc Map-based navigation system with overlays
US5884219A (en) * 1996-10-10 1999-03-16 Ames Maps L.L.C. Moving map navigation system
US5798733A (en) * 1997-01-21 1998-08-25 Northrop Grumman Corporation Interactive position guidance apparatus and method for guiding a user to reach a predetermined target position
US5910782A (en) * 1997-02-25 1999-06-08 Motorola, Inc. On-board vehicle parking space finder service
US6088649A (en) * 1998-08-05 2000-07-11 Visteon Technologies, Llc Methods and apparatus for selecting a destination in a vehicle navigation system
DE10018195A1 (en) * 2000-04-12 2001-10-25 Bayerische Motoren Werke Ag Character-by-character input device for vehicle navigation system, has display unit that shows selected symbols such that display sequence of selected symbols is based on frequency distribution
US7039630B2 (en) * 2000-07-27 2006-05-02 Nec Corporation Information search/presentation system
US20020059207A1 (en) * 2000-07-27 2002-05-16 Nec Corporation Information search/presentation system
US6735516B1 (en) 2000-09-06 2004-05-11 Horizon Navigation, Inc. Methods and apparatus for telephoning a destination in vehicle navigation
US20060037990A1 (en) * 2002-05-03 2006-02-23 Geise Doran J System to navigate within images spatially referenced to a computed space
US8635557B2 (en) 2002-05-03 2014-01-21 205 Ridgmont Solutions, L.L.C. System to navigate within images spatially referenced to a computed space
US7827507B2 (en) 2002-05-03 2010-11-02 Pixearth Corporation System to navigate within images spatially referenced to a computed space
US20040204844A1 (en) * 2003-01-24 2004-10-14 Yonglong Xu Navigational device
US20040243300A1 (en) * 2003-05-26 2004-12-02 Nissan Motor Co., Ltd. Information providing method for vehicle and information providing apparatus for vehicle
US7474958B2 (en) * 2003-05-26 2009-01-06 Nissan Motor Co., Ltd. Information providing method for vehicle and information providing apparatus for vehicle
US20060015246A1 (en) * 2004-07-15 2006-01-19 Alvin Hui Method and apparatus for specifying destination using previous destinations stored in navigation system
US9965954B2 (en) 2005-01-31 2018-05-08 Edward K. Y. Jung Method and system for interactive mapping to provide goal-oriented instructions
US20110082639A1 (en) * 2005-01-31 2011-04-07 Searete Llc Method and system for interactive mapping to provide goal-oriented instructions
US9702713B2 (en) 2005-01-31 2017-07-11 Searete Llc Map-based guide system and method
US20060171308A1 (en) * 2005-01-31 2006-08-03 Jung Edward K Method and system for interactive mapping to provide goal-oriented instructions
US7729708B2 (en) 2005-01-31 2010-06-01 The Invention Science Fund I, Llc Method and system for interactive mapping to provide goal-oriented instructions
US8396001B2 (en) 2005-02-03 2013-03-12 The Invention Science Fund I, Llc Interactive queued mapping method and system
US20060171325A1 (en) * 2005-02-03 2006-08-03 Jung Edward K Interactive queued mapping method and system
US8311733B2 (en) 2005-02-15 2012-11-13 The Invention Science Fund I, Llc Interactive key frame image mapping system and method
US20060181546A1 (en) * 2005-02-15 2006-08-17 Jung Edward K Interactive key frame image mapping system and method
US7756300B2 (en) 2005-02-25 2010-07-13 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US8582827B2 (en) 2005-02-25 2013-11-12 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US8270683B2 (en) 2005-02-25 2012-09-18 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US8107691B2 (en) 2005-02-25 2012-01-31 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US20060198550A1 (en) * 2005-02-25 2006-09-07 Jung Edward K Image mapping to provide visual geographic path
US7734073B2 (en) 2005-02-25 2010-06-08 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US8077928B2 (en) 2005-02-25 2011-12-13 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US20110007093A1 (en) * 2005-02-25 2011-01-13 Searete Llc Image mapping to provide visual geographic path
US7764811B2 (en) 2005-02-25 2010-07-27 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US20110050721A1 (en) * 2005-02-25 2011-03-03 Searete Llc Image mapping to provide visual geographic path
US8805027B2 (en) 2005-02-25 2014-08-12 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US9286729B2 (en) 2005-02-25 2016-03-15 The Invention Science Fund I, Llc Image mapping to provide visual geographic path
US20110044511A1 (en) * 2005-02-25 2011-02-24 Searete Llc Image mapping to provide visual geographic path
US20090076720A1 (en) * 2005-02-25 2009-03-19 Jung Edward K Y Image mapping to provide visual geographic path
US8635014B2 (en) 2005-03-22 2014-01-21 The Invention Science Fund I, Llc Map-based guide system and method
US9188454B2 (en) 2005-03-22 2015-11-17 Invention Science Fund I, Llc Map-based guide system and method
US20080147312A1 (en) * 2005-03-22 2008-06-19 Searete Llc Map-based guide system and method
US20080140313A1 (en) * 2005-03-22 2008-06-12 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Map-based guide system and method
US20080215435A1 (en) * 2005-03-22 2008-09-04 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Map-based guide system and method
US7860648B2 (en) 2005-04-30 2010-12-28 The Invention Science Fund I, Llc Map display system and method
US20060247853A1 (en) * 2005-04-30 2006-11-02 Jung Edward K Map display system and method
US8392114B2 (en) 2005-04-30 2013-03-05 The Invention Science Fund I, Llc Map display system and method
US20090177375A1 (en) * 2005-04-30 2009-07-09 Searete Llc Map Display System and Method
US7522996B2 (en) 2005-04-30 2009-04-21 Searete Llc Map display system and method
US9214033B2 (en) 2005-06-01 2015-12-15 Invention Science Fund I, Llc Map display system and method
US20070186159A1 (en) * 2006-02-08 2007-08-09 Denso International America, Inc. Universal text input method for different languages
US20070233658A1 (en) * 2006-03-31 2007-10-04 Aol Llc Identifying a result responsive to location data for multiple users
US20100241351A1 (en) * 2006-03-31 2010-09-23 Aol Inc. Identifying a result responsive to a current location of a client device
US20080086455A1 (en) * 2006-03-31 2008-04-10 Aol Llc Communicating appointment and/or mapping information among a calendar application and a navigation application
US7941753B2 (en) 2006-03-31 2011-05-10 Aol Inc. Communicating appointment and/or mapping information among a calendar application and a navigation application
US9752890B2 (en) 2006-03-31 2017-09-05 Facebook, Inc. Identifying a result responsive to a current location of a client device
US9618358B2 (en) 2006-03-31 2017-04-11 Facebook, Inc. Identifying a result responsive to a current location of a client device
US9234762B2 (en) 2006-03-31 2016-01-12 Facebook, Inc. Identifying results responsive to a future location of a client device
US7743056B2 (en) 2006-03-31 2010-06-22 Aol Inc. Identifying a result responsive to a current location of a client device
US8560232B2 (en) 2006-12-29 2013-10-15 Facebook, Inc. Meeting notification and modification service
US8489329B2 (en) 2006-12-29 2013-07-16 Facebook, Inc. Meeting notification and modification service
US7869941B2 (en) 2006-12-29 2011-01-11 Aol Inc. Meeting notification and modification service
US8712810B2 (en) 2006-12-29 2014-04-29 Facebook, Inc. Reserving a time block in a calendar application to account for a travel time between geographic locations of appointments
US8073614B2 (en) 2006-12-29 2011-12-06 Aol Inc. Meeting notification and modification service
US9867014B2 (en) 2006-12-29 2018-01-09 Facebook, Inc. Meeting notification and modification service
US20110077860A1 (en) * 2006-12-29 2011-03-31 Aol Inc. Meeting notification and modification service
US8364400B2 (en) 2006-12-29 2013-01-29 Facebook, Inc. Meeting notification and modification service
US8554477B2 (en) 2006-12-29 2013-10-08 Facebook, Inc. Meeting notification and modification service
US9243911B2 (en) 2006-12-29 2016-01-26 Facebook, Inc. Meeting notification and modification service
US20080167938A1 (en) * 2006-12-29 2008-07-10 Aol Llc Reserving a time block in a calendar application to account for a travel time between geographic locations of appointments
US8554476B2 (en) 2006-12-29 2013-10-08 Facebook, Inc. Meeting notification and modification service
US20080167937A1 (en) * 2006-12-29 2008-07-10 Aol Llc Meeting notification and modification service
US20100235078A1 (en) * 2009-03-12 2010-09-16 Microsoft Corporation Driving directions with maps and videos
US9195290B2 (en) 2009-10-28 2015-11-24 Google Inc. Navigation images
US20110098918A1 (en) * 2009-10-28 2011-04-28 Google Inc. Navigation Images
US11768081B2 (en) 2009-10-28 2023-09-26 Google Llc Social messaging user interface

Also Published As

Publication number Publication date
EP0323230A3 (en) 1991-09-11
JPH01173820A (en) 1989-07-10
EP0323230A2 (en) 1989-07-05

Similar Documents

Publication Publication Date Title
US5115399A (en) Position input system for vehicular navigation apparatus
US4992947A (en) Vehicular navigation apparatus with help function
US5231584A (en) Navigation apparatus with non-volatile memory for return to initial departure point
US5121326A (en) Display system in navigation apparatus
US5103400A (en) Destination guidance method of vehicle navigating
US5191532A (en) Navigation apparatus
JP4124443B2 (en) Navigation device and destination specifying method
EP0314398A2 (en) Navigation apparatus based on present position calculating system
JPH09218048A (en) Drive simulation method
EP0348528A1 (en) Vehicle navigation system
JPH0962994A (en) Navigation device for vehicle
EP0355232A2 (en) Road drawing system for a navigation apparatus
EP0346483A1 (en) Navigation unit
JP3402836B2 (en) Navigation apparatus and navigation processing method
EP1406064A1 (en) Map display system
JPH09152353A (en) Navigation device with spot registration function
JP3594673B2 (en) Traffic information display
JP2725923B2 (en) Route guidance method for in-vehicle navigator
JP2624232B2 (en) Route guidance device for vehicles
JP2731387B2 (en) Vehicle navigation system
JP3024464B2 (en) Travel position display device
JP2741852B2 (en) Vehicle navigation system
JP3240708B2 (en) Map display device
JP3097377B2 (en) Route guidance device for vehicles
JPH01173823A (en) Destination input system for navigation device for vehicle

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
FP Lapsed due to failure to pay maintenance fee

Effective date: 20040519

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362