Method and system for writing and reading coded data




Details

Classification: 354/67

IPC: H03M 7/00 (2006.01); G06T 9/00 (2006.01); H04N 7/26 (2006.01); H04N 7/30 (2006.01)

Patent: CA 2009848

A method of writing and reading coded data handles data coded by orthogonal transform coding so that an original image can be restored progressively or sequentially. The original image is divided into blocks of an arbitrary number of pixels, and the coded data are obtained by coding the quantization coefficients produced when the gradation levels of the pixels within each block are subjected to a two-dimensional discrete cosine transform. The method comprises the steps of: extracting the quantization coefficients for each restoration stage; subjecting the extracted quantization coefficients to variable length coding; writing the variable-length-coded quantization coefficients into storage means as the coded data; reading the coded data from the storage means; restoring a code length of the coded data based on the read coded data; extracting the coded data corresponding to the restored code length from the read coded data; and outputting the extracted coded data as the coded data required in each restoration stage.
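The core of the claim — coding coefficients per restoration stage with a variable-length code whose length the reader can restore from the coded data itself — can be sketched as follows. This is a minimal illustration, not the patent's actual scheme: the patent does not disclose its VLC tables, so an Elias-gamma-style prefix code is assumed here, and the function names (`encode_coeff`, `write_stages`, `read_stages`) are illustrative, not from the source.

```python
def encode_coeff(c):
    """Variable-length code for one signed quantization coefficient.

    Assumed scheme (Elias-gamma style): map the signed value to a
    non-negative integer, then emit a unary run of zeros whose length
    tells the decoder how many payload bits follow.
    """
    u = 2 * c if c >= 0 else -2 * c - 1   # signed -> non-negative mapping
    bits = bin(u + 1)[2:]                 # binary payload for u + 1
    return '0' * (len(bits) - 1) + bits   # length prefix + payload


def decode_coeff(stream, pos):
    """Restore the code length from the data itself, then decode one value."""
    zeros = 0
    while stream[pos + zeros] == '0':     # count the unary length prefix
        zeros += 1
    length = 2 * zeros + 1                # total bits of this codeword
    u = int(stream[pos + zeros: pos + length], 2) - 1
    c = u // 2 if u % 2 == 0 else -(u + 1) // 2
    return c, pos + length


def write_stages(stages):
    """Write the VLC codes of every restoration stage into one bitstring
    (standing in for the 'storage means' of the claim)."""
    return ''.join(encode_coeff(c) for stage in stages for c in stage)


def read_stages(stream, stage_sizes):
    """Read the stored bitstring back, restoring each code's length to
    extract exactly the coded data required in each restoration stage."""
    pos, out = 0, []
    for size in stage_sizes:
        stage = []
        for _ in range(size):
            c, pos = decode_coeff(stream, pos)
            stage.append(c)
        out.append(stage)
    return out
```

Because each codeword carries its own length prefix, no side table of code lengths needs to be stored: the reader recovers stage boundaries directly from the bitstream, which is what lets restoration proceed stage by stage (e.g., DC coefficients first, then progressively more AC coefficients).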


Profile ID: LFCA-PAI-O-1571569
