Method of detecting boundary structures in a video signal




Details

Classification: 350/33

IPC: H04N 1/41 (2006.01), G06T 5/00 (2006.01), G06T 9/00 (2006.01), H04N 1/409 (2006.01)

Patent: CA 1278373

27371-153

ABSTRACT OF THE DISCLOSURE

A method of detecting edge structures in a video signal, with a decision criterion being derived from the environment pixels of an image pixel so as to realize image coding with the least possible number of bits. The object of detecting all oblique edges, i.e. edges which are neither horizontal nor vertical, is accomplished in that an average is formed from the environment pixels of an image pixel and this average is then compared with each environment pixel so as to obtain an output signal having one of three values (1, 0, -1), depending on whether the luminance or chrominance value of the particular environment pixel lies above, within or below a given tolerance range. A conclusion as to the existence of an edge and its orientation is then drawn from the number of immediately consecutive identical positive or negative values (1 or -1) of this three-valued signal for an image pixel and from the position of the changes in value within the sequence of environment pixels.
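The decision criterion described above can be illustrated in code. The following Python sketch is one plausible reading under stated assumptions, not the patented implementation: it assumes a 3x3 neighbourhood whose eight environment pixels are visited clockwise as a closed ring, a fixed tolerance band around their average, and a minimum run length of three. The traversal order, the tolerance value, the run threshold and all names (RING, classify_neighbourhood, longest_run, detect_edge) are assumptions made for illustration.

# Illustrative sketch of the decision criterion from the abstract
# (assumptions as noted above; not the patented implementation).

# Offsets of the eight environment pixels, visited clockwise so that
# "immediately consecutive" values are spatial neighbours on the ring.
RING = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
        (1, 1), (1, 0), (1, -1), (0, -1)]

def classify_neighbourhood(image, x, y, tolerance=8):
    """Three-valued signal (1, 0, -1) for the eight environment pixels.

    Each environment pixel is compared with the average of all eight:
    1 if its value lies above the tolerance range around that average,
    -1 if below it, 0 if within it.
    """
    values = [image[y + dy][x + dx] for dy, dx in RING]
    avg = sum(values) / len(values)
    return [1 if v > avg + tolerance else -1 if v < avg - tolerance else 0
            for v in values]

def longest_run(signal):
    """Length and start index of the longest circular run of identical
    non-zero values in the three-valued signal."""
    n = len(signal)
    best_len, best_start = 0, None
    for start in range(n):
        v = signal[start]
        if v == 0:
            continue
        length = 1
        while length < n and signal[(start + length) % n] == v:
            length += 1
        if length > best_len:
            best_len, best_start = length, start
    return best_len, best_start

def detect_edge(signal, min_run=3):
    """An edge is taken to exist when the ring contains a sufficiently
    long run of identical non-zero values; where that run sits on the
    ring indicates the orientation of the edge."""
    length, start = longest_run(signal)
    return length >= min_run, start

A minimal check of the sketch: for the 3x3 patch [[10, 10, 200], [10, 10, 200], [10, 10, 200]], classify_neighbourhood(image, 1, 1) yields the ring [-1, -1, 1, 1, 1, -1, -1, -1]; its longest run of identical non-zero values has length 5, so detect_edge reports an edge, and the positions where the ring changes sign indicate a vertical boundary through the neighbourhood. An oblique edge would shift those change positions around the ring.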

524172


