Pattern classification system and method for collective...

Classification: G06K (Section G — Physics)

Patent


Details

Int. Cl.: G06K 9/66 (2006.01); G06K 9/32 (2006.01); H04N 7/18 (2006.01)

Patent: CA 2715971

A method for configuring a pattern recognition system begins by receiving object recognition data from at least one first local image processing system. The object recognition data is stored in at least one global database. Configuration data is determined for a second local image processing system based at least in part upon the received object recognition data from the at least one first image processing system, and then transmitted to the second local image processing system.
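The claimed method has three steps: receive object recognition data from a first local system, store it in a global database, then derive and transmit configuration data for a second local system. A minimal sketch of that flow is below; all names (`GlobalDatabase`, `configure_second_system`, the `labels` field) are hypothetical illustrations, not the patent's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class GlobalDatabase:
    """Hypothetical global store for received object recognition data."""
    records: list = field(default_factory=list)

    def store(self, data: dict) -> None:
        self.records.append(data)

def configure_second_system(db: GlobalDatabase) -> dict:
    """Determine configuration data for a second local image processing
    system, based at least in part on previously stored recognition data."""
    labels = {lbl for rec in db.records for lbl in rec.get("labels", [])}
    return {"known_labels": sorted(labels)}

# Step 1: receive object recognition data from a first local system
db = GlobalDatabase()
db.store({"system_id": "local-1", "labels": ["car", "pedestrian"]})

# Step 2: determine configuration data for the second local system
config = configure_second_system(db)

# Step 3: transmit the configuration (here, simply print it)
print(config)
```

In this sketch the "configuration data" is just the set of object labels seen by the first system, standing in for whatever model parameters or settings the patent contemplates.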




Profile ID: LFCA-PAI-O-2009245
