Lectures and Lecture Schedule

Here you will find information pertaining to each week's lecture: the material you are responsible for (e.g., topic(s) to be covered, sections in the textbook, papers and additional notes), the PowerPoint presentation and any relevant notes or comments.  I will aim to post the presentation (in PDF format) about one day prior to the lecture, although this is not guaranteed.  Lecture notes posted prior to the lecture are preliminary; after the lecture, an updated presentation will be posted (depending on the lecture, I may modify the preliminary version slightly, and although I will do my best to eliminate them, preliminary notes may contain minor errors).

Date
Main Topic(s)
Textbook Sections
Powerpoint Slides
Notes/Comments
Week 1 Jan. 10
Introduction to image processing and examples of fields that use image processing; components of an image processing system. Chapter 1 (complete)
Final notes
 3 Slides/page
 6 Slides/page

After an introduction to the "administrative" details regarding the course (e.g., course outline etc.), this lecture will begin with an introduction to the field of digital image processing.  Terminology will be introduced including a definition of an image and digital image processing followed by a brief discussion on the many uses of digital processing and how it has impacted our lives.

Here is some further information, just for your own interest:

Week 2 Jan. 17
Introduction to visual perception.  The electromagnetic (EM) spectrum.  Image acquisition, sampling and quantization. Chapter 2:
2.1, 2.1.1, 2.1.2, 2.1.3, 2.2, 2.3, 2.3.1, 2.3.2, 2.3.3, 2.3.4, 2.4, 2.4.1, 2.4.2
Final notes
 3 Slides/page
 6 Slides/page
The first half of this lecture will begin with an introduction to the human visual system, followed by a discussion of the electromagnetic spectrum (in greater detail than the introduction given during last week's lecture).  Both of these topics are extremely large on their own, and an entire course could be spent on each; this lecture will simply introduce the fundamental concepts (terminology etc.) required for digital image processing.  In the second half of the lecture, we will focus on image acquisition, sampling and quantization.  This topic should be somewhat of a review, as you have covered the concepts in the Digital Signal Processing course for the 1-D case (e.g., 1-D signals); here we will be concerned with 2-D signals.  The topic includes a discussion of the various types of sensor arrangements used to sample an image in the spatial domain (e.g., single sensor, 1-D sensor arrays, and the 2-D sensor arrays common in most CCD-based digital cameras), followed by the methods used to quantize an image (e.g., discretizing an image with respect to intensity or gray level).  A brief introduction to potential problems that sampling can lead to (e.g., aliasing) will be given, although greater emphasis will be placed on this topic in future lectures.
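To make the idea of quantization concrete, here is a minimal sketch of uniform intensity quantization.  The course labs use LabView/IMAQ and Matlab; this Python snippet (function name and the toy intensity values are my own, purely for illustration) shows how a continuous range of intensities collapses onto a small number of discrete gray levels.

```python
def quantize(value, levels=4, max_intensity=255):
    """Map an intensity in [0, max_intensity] to one of `levels`
    evenly spaced gray levels (uniform quantization)."""
    step = (max_intensity + 1) / levels          # width of each quantization bin
    index = min(int(value // step), levels - 1)  # bin index, clamped to the top level
    return int(index * (max_intensity / (levels - 1)))  # representative gray level

# A smooth ramp of intensities collapses onto 4 discrete gray levels:
ramp = [0, 60, 100, 128, 200, 255]
print([quantize(v) for v in ramp])  # [0, 0, 85, 170, 255, 255]
```

Increasing `levels` reduces the quantization error; with too few levels, smooth gradients show visible "false contours."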

Here are some links/references for your own interest.

Lab:
This week's lab: Lab 1.  You can ignore "Procedure 3" (last page of the lab).  There is no report required for this lab; however, you must complete the "Exercise" portion of the lab during the lab period and show the instructor prior to signing off.  In addition, the following assignment is to be completed and submitted at the beginning of next week's lecture (e.g., 6:05 p.m.) - see also the "Assignments" page.
Week 3 Jan. 24
Image acquisition, sampling and quantization (continued from last week).  Basic relationships between pixels. Chapter Two:
2.4.3, 2.4.5, 2.5, 2.5.1, 2.5.2, 2.5.3, 2.5.4, 2.6
Final notes
 3 Slides/page
 6 Slides/page
In the first half of this lecture, we will continue our discussion of image sampling and acquisition.  In particular, we will discuss spatial and gray-level resolution, aliasing, and give a brief introduction to image up-sampling and down-sampling (image zooming and shrinking).  In the second half of the lecture, we will examine several basic relationships amongst pixels.  The lecture will end with a discussion of linear and non-linear operators.  Some examples on the board will follow.
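The basic pixel relationships can be sketched in a few lines.  This Python illustration (function names are my own; the labs use LabView/IMAQ, not Python) computes the standard 4-, diagonal, and 8-neighborhoods of a pixel given as (row, col) coordinates:

```python
def n4(p):
    """4-neighbors of pixel p = (row, col): up, down, left, right."""
    r, c = p
    return {(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)}

def nd(p):
    """Diagonal neighbors of p."""
    r, c = p
    return {(r - 1, c - 1), (r - 1, c + 1), (r + 1, c - 1), (r + 1, c + 1)}

def n8(p):
    """8-neighbors: the union of N4(p) and ND(p)."""
    return n4(p) | nd(p)

print(sorted(n4((1, 1))))  # [(0, 1), (1, 0), (1, 2), (2, 1)]
```

Note that for pixels on the image border, some of these coordinates fall outside the image; an implementation would have to clip them to the image bounds.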

Lab:
This week's lab: Lab 2.  In this lab, both LabView and IMAQ will be used. You will also require a digital camera and related equipment.  A report must be submitted for this lab.
The lab report and assignment are due the following week at the start of the lab - see also the "Assignments" page.
Week 4 Jan. 31
Linear and non-linear operations. Image enhancement in the spatial domain: basic gray-level transformations (image negatives, log transforms, power-law and piecewise transforms).  Histogram processing. Chapter Two:
2.6

Chapter Three
3.1, 3.2, 3.2.1, 3.2.2, 3.2.3, 3.2.4
Final notes
 3 Slides/page
 6 Slides/page
This lecture will begin by continuing the discussion of linear and non-linear operators that we began during last week's lecture.  Following this, image enhancement in the spatial domain will be introduced. Spatial domain image enhancement refers to modifying the image in some manner via operations performed directly on the pixels (e.g., intensity values) themselves.  The mathematical definition of an image operator will be introduced and several common image enhancement operators will be covered.   Finally, image histograms will be introduced.
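The basic gray-level transformations listed above have simple closed forms.  A minimal Python sketch (my own function names and constant choices, for illustration only; the labs use LabView/IMAQ and Matlab), assuming 256 gray levels:

```python
import math

L = 256  # number of gray levels

def negative(r):
    """Image negative: s = (L - 1) - r."""
    return (L - 1) - r

def log_transform(r, c=None):
    """Log transform: s = c * log(1 + r).
    By default c is chosen so that r = L - 1 maps to s = L - 1."""
    if c is None:
        c = (L - 1) / math.log(L)
    return c * math.log(1 + r)

def power_law(r, gamma, c=1.0):
    """Power-law (gamma) transform on intensities normalized to [0, 1]."""
    return c * (L - 1) * (r / (L - 1)) ** gamma

print(negative(0), negative(255))          # 255 0
print(round(log_transform(10)))            # dark values are stretched
print(round(power_law(128, gamma=0.5)))    # gamma < 1 brightens mid-tones
```

Each transform maps an input gray level r to an output gray level s independently of the pixel's neighbors, which is what makes these "point" operations.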

Here are some links/references for your own interest.
Lab:
This week's lab: Lab 3.  In this lab, both LabView and IMAQ will be used. You will also require a digital camera and related equipment.  No lab report is required for this lab although there is an assignment due February 14, 2006.
Week 5
Feb. 7
Image enhancement in the spatial domain: histograms and histogram processing, arithmetic operators, basics of spatial filtering, smoothing spatial filters. Chapter Three
3.3 (up to page 90), 3.4, 3.4.1, 3.5, 3.6, 3.6.1
Final notes
 3 Slides/page
 6 Slides/page
The first part of this lecture will continue our discussion of image enhancement in the spatial domain and the introduction to histograms and histogram processing that we started last lecture (Jan. 31).  We will then proceed to discuss arithmetic operations.
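Computing a histogram is straightforward; a short Python sketch (hypothetical toy pixel values, for illustration only) of the gray-level histogram and its normalized form:

```python
from collections import Counter

def histogram(pixels, levels=8):
    """Count how many pixels take each gray level 0..levels-1."""
    counts = Counter(pixels)
    return [counts.get(g, 0) for g in range(levels)]

def normalized_histogram(pixels, levels=8):
    """p(r_k) = n_k / n: an estimate of the probability of gray level r_k."""
    h = histogram(pixels, levels)
    n = len(pixels)
    return [count / n for count in h]

pixels = [0, 1, 1, 3, 3, 3, 7, 7]      # a tiny "image" with 8 gray levels
print(histogram(pixels))               # [1, 2, 0, 3, 0, 0, 0, 2]
print(sum(normalized_histogram(pixels)))  # 1.0 -- probabilities sum to one
```

The normalized histogram is the starting point for histogram equalization, where its cumulative sum is used as the gray-level mapping.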

The second part of this lecture will introduce the concept of filtering an image in the spatial domain.  In particular, we will discuss the "mechanics" of filtering an image with a filter (the type of filter depends on the application; however, the "mechanics" remain the same) in the spatial domain (e.g., by directly manipulating the image's pixel gray levels).  A filter is also known as a template, kernel or mask, among other names.  We briefly discussed the concept of a template (mask, etc.) during the Jan. 31 lecture; here we will review those concepts and then build upon them in further depth.  Once we have familiarized ourselves with spatial filtering using a kernel, an application of it (averaging/blurring) will be introduced.  This is actually a very important topic and it is highly recommended you read over the lecture notes and the appropriate sections in the book very carefully!
Week 6 Feb. 14
Review of the basics of spatial filtering, smoothing spatial filters, sharpening spatial filters. Chapter Three 3.6, 3.6.1, 3.6.2, 3.7, 3.7.1 (up to page 125), 3.7.3, 3.8 Final notes
 3 Slides/page
 6 Slides/page
This lecture will continue with the material regarding spatial filtering that we started last week.  A brief review of the "mechanics" of spatial filtering will be presented, followed by a discussion of how specific kernels can be used to perform various operations on an image via those same "mechanics".  In particular, we will examine smoothing spatial filters, which are used to remove noise from an image (and also to "blur" an image), followed by sharpening spatial filters, which are used to "sharpen" an image and "highlight" sharp transitions between intensity values (e.g., edges).

Additional material relevant to the lecture:
  • Try some simple examples of spatial filtering on your own.  Begin by computing the double summation convolution formula introduced in class for a 3 x 3 mask with all coefficients equal to 1, then try a 5 x 5 mask.  The material provided in the notes should be sufficient for you to do this (and of course you also have the book to refer to)!
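The suggested exercise can also be checked in code.  Here is a minimal Python sketch of the double-summation spatial filtering formula (my own function name and toy image values; the labs use LabView/IMAQ and Matlab), evaluated only where the mask fits entirely inside the image:

```python
def filter2d(image, mask):
    """Apply the double-summation spatial filtering formula
    g(x, y) = sum_s sum_t w(s, t) * f(x + s, y + t)
    at every pixel where the mask lies fully inside the image."""
    m, n = len(mask), len(mask[0])        # mask dimensions (assumed odd)
    a, b = m // 2, n // 2
    rows, cols = len(image), len(image[0])
    out = []
    for x in range(a, rows - a):
        row = []
        for y in range(b, cols - b):
            total = sum(mask[s + a][t + b] * image[x + s][y + t]
                        for s in range(-a, a + 1)
                        for t in range(-b, b + 1))
            row.append(total)
        out.append(row)
    return out

# A 3 x 3 mask of ones sums the 3 x 3 neighborhood; dividing by 9 would average it.
image = [[1, 2, 3],
         [4, 5, 6],
         [7, 8, 9]]
ones = [[1, 1, 1]] * 3
print(filter2d(image, ones))  # [[45]] -- the sum of all nine pixels
```

Note this sketch ignores the image border; real implementations must decide how to pad (or skip) pixels near the edge.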
Lab:
This week's lab: Lab 4.  In this lab, IMAQ will be used along with a digital camera and related equipment.  A lab report is required for this lab.
Week 7 Feb. 21



Reading Week - No Lectures or Labs!
Week 8 Feb. 28



Mid-term exam
Strike



Strike from March 7 - 24
Week 9 Mar. 28
Sharpening spatial filters, introduction to edges, introduction to the first and second order derivatives, combining spatial filtering techniques. Chapter Three 3.6, 3.6.1, 3.6.2, 3.7, 3.7.1 (up to page 125), 3.7.3, 3.8 Final notes
 3 Slides/page
 6 Slides/page
In this lecture we will continue our discussion of sharpening spatial filters.  This will lead to a brief introduction to edges and how they can be modeled, followed by a discussion of the first-order digital derivative.  We will then focus on the second-order derivative and how it can be modeled.  Both the first and second order derivatives are used to detect edges.  Finally, the lecture will conclude with a brief discussion of how smoothing and sharpening filters can be combined.  Example applications will be provided.
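The behavior of the two digital derivatives at an edge can be sketched in one dimension.  A small Python illustration (function names and the toy scan line are my own, for illustration only) using the standard digital-difference definitions:

```python
def first_derivative(f):
    """First-order digital derivative: df/dx ~= f(x+1) - f(x)."""
    return [f[x + 1] - f[x] for x in range(len(f) - 1)]

def second_derivative(f):
    """Second-order digital derivative: f(x+1) + f(x-1) - 2*f(x)."""
    return [f[x + 1] + f[x - 1] - 2 * f[x] for x in range(1, len(f) - 1)]

# A 1-D scan line with a step edge between gray levels 1 and 6:
scan = [1, 1, 1, 6, 6, 6]
print(first_derivative(scan))   # [0, 0, 5, 0, 0] -- one strong response at the edge
print(second_derivative(scan))  # [0, 5, -5, 0]   -- double response (zero crossing)
```

The single response of the first derivative versus the sign-changing double response of the second derivative is exactly why the second derivative (e.g., the Laplacian) is favored for sharpening: it produces a zero crossing at the edge and enhances fine detail more aggressively.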

Lab:
We will be working on Lab 6 this week.  The lab may span two weeks depending on how far we get. This lab deals primarily with Matlab and it is recommended you read the lab prior to the lab period.  The lab also makes use of the following image: lenna.jpg (this image is actually one of the most popular images in the computer vision/image processing fields - it dates back 25-30 years!).  There is a lab report for this lab; it is due March 21, 2006 (if the lab is completed in one week) or March 28, 2006 (if the lab is completed in two weeks).
Week 10
April 4
Brief introduction to the 1D Fourier transform and its properties, introduction to the 2D Fourier transform and its properties, introduction to filtering in the Fourier domain. Chapter Four
4.1, 4.2, 4.2.1
Final notes
 3 Slides/page
 6 Slides/page

Background (from Richard G. Lyons' Understanding Digital Signal Processing book):

The Arithmetic of Complex Numbers

1D discrete Fourier transform example
In this lecture we will begin examining the Fourier transform. In particular, we will begin with some background, followed by an introduction to the one-dimensional Fourier transform and some of its properties.  Although we are interested in the two-dimensional Fourier transform in this course, it can be generalized from the one-dimensional Fourier transform, hence we will begin with the one-dimensional case.  We will briefly review the 1D Fourier transform and then begin a discussion of the 2D Fourier transform, followed by a discussion of filtering in the frequency domain, where we will examine both low-pass and high-pass filters in detail.

I suggest you take a look at the notes I have added from Richard Lyons' book on the arithmetic of complex numbers and the 1D Fourier example.

Finally, the following links are provided for your interest:
  • More about Jean Baptiste Joseph Fourier (a biography)
  • For some fun, download and listen to "Fourier's Song" (mp3) by Dr. Time and Brother Fre(quency) (well, in reality by Dr. Robert Williamson of the Australian National University).  The link includes lyrics to the song - it's actually quite amusing!
Week 11
April 11
Continuation of our discussion of the 2D Fourier transform, introduction to filtering in the frequency domain, properties of the frequency domain, the Convolution Theorem, Gaussian filters. Chapter Four
4.1, 4.2, 4.2.1, 4.2.2, 4.2.3, 4.2.4
Final notes
 3 Slides/page
 6 Slides/page

Background (from Richard G. Lyons' Understanding Digital Signal Processing book):

The Arithmetic of Complex Numbers

1D discrete Fourier transform example
In this lecture we will continue our discussion of the 2D Fourier transform, followed by a discussion of filtering in the frequency domain, where we will examine both low-pass and high-pass filters in detail.  We will then discuss the Gaussian filter, followed by a discussion of the Convolution Theorem (this is very important!).  I would also like to review the "1D discrete Fourier transform example" from Lyons' book.

Lab:
We will be working on Lab 7 this week. 
Lab 7 examines edge detection using IMAQ Vision Builder.   No camera or related equipment is required for this lab.  This lab should be fairly straightforward to complete.  A lab report is required for this lab.

Reminder/summary regarding the DFT:

Given a 1D input sequence x[n] of size M (e.g., M samples), after performing a DFT operation on the input sequence, we obtain our output DFT sequence X[m] also of size M.

  • x denotes the discrete input signal.
  • X denotes the discrete DFT output.
  • The size (number of samples) of our input sequence is equal to the number of samples in our output sequence (e.g., equal to M for both).  In other words, the size of the input signal (sequence) determines the size of our output signal (sequence).
  • M denotes the size of our input signal (e.g., number of input samples) and the size of the output DFT signal (e.g., number of DFT samples).  Remember, input sequence and output sequence are discrete!
  • n denotes the index into our input signal (e.g., x[n] is the nth input sample).
  • m denotes the index into our DFT output signal (e.g., X[m] = mth DFT output).
  • Keep in mind, m and n are simply indices - any letter can be used to denote an index but we will use these two as defined above for consistency.
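The summary above translates directly into code.  A minimal Python sketch of the 1D DFT (my own function name and example sequence; the course works these examples by hand and in Matlab) showing that an input of size M produces an output of size M:

```python
import cmath

def dft(x):
    """Discrete Fourier transform of a 1-D sequence x[n] of size M.
    Returns X[m], also of size M:
    X[m] = sum over n of x[n] * exp(-j * 2 * pi * m * n / M)."""
    M = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * m * n / M)
                for n in range(M))
            for m in range(M)]

x = [1, 0, 0, 0]                         # a discrete impulse, M = 4 samples
X = dft(x)
print(len(X))                            # 4 -- output size equals input size M
print([round(abs(Xm), 6) for Xm in X])   # [1.0, 1.0, 1.0, 1.0] -- flat spectrum
```

The impulse input is a useful sanity check: its DFT has constant magnitude across all M frequency samples, i.e., an impulse contains all frequencies equally.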
Week 12 Apr. 18
Convolution theorem, discontinuity detection and image segmentation. Chapter 10: 10.1, 10.1.1, 10.1.2, 10.1.3, 10.3.3, 10.3.4, 10.4, 10.4.1, 10.4.2, 10.4.3 Final notes
 3 Slides/page
 6 Slides/page
In this lecture we will conclude our discussion of frequency domain filtering by examining the convolution theorem in greater detail.  We will then spend a portion of the remainder of the lecture discussing discontinuity detection and image segmentation.  In particular, we will examine point and line detection, in addition to applying a threshold to an image in order to detect objects within it.  We will spend the last 15-20 minutes of the lecture going over solutions to the mid-term exam.
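Thresholding, the simplest segmentation technique mentioned above, can be sketched in a few lines.  A minimal Python illustration (my own function name and toy image values, for illustration only) of global thresholding:

```python
def threshold(image, T):
    """Global thresholding: label a pixel as object (1) if f(x, y) > T,
    and as background (0) otherwise."""
    return [[1 if pixel > T else 0 for pixel in row] for row in image]

# A bright object (gray level ~200) on a dark background (~30):
image = [[30,  32, 200],
         [29, 210, 205],
         [31,  33, 198]]
print(threshold(image, T=128))  # [[0, 0, 1], [0, 1, 1], [0, 0, 1]]
```

The quality of the result depends entirely on the choice of T; when the histogram of the image is clearly bimodal, a value between the two modes separates object from background well.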

Lab:
We will be working on Lab 8 this week.
This is the last lab of the term and should be straightforward to complete.  You will require the use of a camera and camera equipment.  No lab report is required for this lab.  There is, however, an assignment to be completed - the assignment is actually good practice for your exam and I strongly recommend you complete it!
Week 13
Apr. 25



Final test (no lecture and no lab) - Good Luck!


  This page maintained by Bill Kapralos
Last modified:  Tuesday, April 18 2006