Lectures and Lecture Schedule

Here you will find information pertaining to each week's lecture: in particular, the material you are responsible for (e.g., topic(s) to be covered, sections in the textbook, papers and additional notes), the PowerPoint presentation, and any relevant notes or comments.  I will aim to post the presentation (in PDF format) about one day prior to the lecture, although there is no guarantee.  Lecture notes posted prior to the lecture are preliminary; after the lecture, an updated presentation will be posted (depending on the lecture, I may modify the preliminary version slightly, and although I will do my best to eliminate them, preliminary notes may contain minor errors).

Date
Main Topic(s)
Textbook Sections
Powerpoint Slides
Notes/Comments
Week 1 Sept. 12
Introduction to image processing and examples of fields that use image processing.  Components of an image processing system.
Chapter 1 (complete)
Final notes
 3Slides/page
 4Slides/page

After an introduction to the "administrative" details regarding the course (e.g., course outline etc.), this lecture will begin with an introduction to the field of digital image processing.  Terminology will be introduced, including a definition of an image and of digital image processing, followed by a brief discussion of the many uses of digital image processing and how it has impacted our lives.

Here is some further information, just for your own interest:

Week 2 Sept. 19
Introduction to visual perception.  The electromagnetic (EM) spectrum.  Image acquisition, sampling and quantization.
Chapter 2:
2.1, 2.1.1, 2.1.2, 2.1.3, 2.2, 2.3, 2.3.1, 2.3.2, 2.3.3, 2.3.4, 2.4, 2.4.1, 2.4.2

Final notes
 3Slides/page
 4Slides/page
The first half of this lecture will begin with an introduction to the human visual system, followed by a discussion of the electromagnetic spectrum (in greater detail than the introduction given during last week's lecture).  Both of these topics are extremely large on their own and we could spend an entire course on each; this lecture will simply introduce some fundamental concepts (terminology etc.) as required for digital image processing.  In the second half of the lecture, we will focus on image acquisition, sampling and quantization.  This topic should be somewhat of a review, as you have covered the concepts for the 1-D case (e.g., 1-D signals) in the Digital Signal Processing course; here we will be concerned with 2-D signals.  The topic includes a discussion of the various sensor arrangements used to sample an image in the spatial domain (e.g., a single sensor, a 1-D sensor array, and the 2-D sensor arrays common in most CCD-based digital cameras), followed by the methods used to quantize an image (e.g., sampling of an image with respect to intensity or gray level).  Potential problems that sampling can lead to (e.g., aliasing) will be briefly introduced, although greater emphasis will be placed on this topic in future lectures.
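If you would like to get a feel for quantization on your own, here is a small illustrative sketch in Python/NumPy (the labs use LabView/IMAQ and Matlab, so treat this only as a rough sketch of the idea of reducing the number of gray levels, not as lab material):

    import numpy as np

    def quantize(image, levels):
        # Requantize an 8-bit grayscale image (values 0-255) to the given
        # number of gray levels, e.g. levels=8 keeps only 8 distinct intensities.
        step = 256 // levels                 # width of each intensity bin
        return (image // step) * step        # map every pixel to the bottom of its bin

    # Example: a small synthetic "ramp" image quantized to 4 gray levels
    img = np.tile(np.arange(256, dtype=np.uint8), (16, 1))
    coarse = quantize(img, 4)
    print(np.unique(coarse))                 # [  0  64 128 192]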

Here are some links/references for your own interest.

Lab:
This week's lab: Lab 1.  You can ignore "Procedure 3" (last page of the lab).  There is no report required for this lab; however, you must complete the "Exercise" portion of the lab during the lab period and show it to the instructor for sign-off.  In addition, the following assignment is to be completed and submitted at the beginning of the lecture (e.g., 6:05pm) the following week (Monday, September 26, 2005) - see also the "Assignments" page.
Week 3 Sept. 26
Image acquisition, sampling and quantization (continued from last week).  Basic relationships between pixels
Chapter Two:
2.4.3, 2.4.5, 2.5, 2.5.1, 2.5.2, 2.5.3, 2.5.4, 2.6

Final notes
 3Slides/page
 4Slides/page



In the first half of this lecture, we will continue our discussion on image sampling and acquisition.  In particular, we will discuss spatial and gray-level resolution and aliasing, and give a brief introduction to image up-sampling and down-sampling (image zooming and shrinking).  In the second half of the lecture, we will examine several basic relationships amongst pixels.  The lecture will end with a discussion of linear and non-linear operators.  Some examples on the board will follow.
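As a small illustration of image shrinking and zooming by simple index manipulation (pixel deletion and pixel replication), here is a sketch in Python/NumPy; it is only a toy example of the idea, not part of the lab material:

    import numpy as np

    # Down-sample by keeping every 2nd row/column, then "zoom" back by replication
    img = np.arange(64, dtype=np.uint8).reshape(8, 8)
    shrunk = img[::2, ::2]                                        # 8x8 -> 4x4
    zoomed = np.repeat(np.repeat(shrunk, 2, axis=0), 2, axis=1)   # nearest-neighbour zoom back to 8x8
    print(img.shape, shrunk.shape, zoomed.shape)                  # (8, 8) (4, 4) (8, 8)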

Lab:
This week's lab: Lab 2.  In this lab, both LabView and IMAQ will be used.  You will also require a digital camera and related equipment.  A report must be submitted with this lab.  In addition, there is also an assignment to accompany the lab: Questions 2.2, 2.14 and 2.19 (Chapter 2) from the Gonzalez and Woods textbook.

The lab report and assignment are due the following week (Monday, October 3, 2005) at the start of the lab - see also the "Assignments" page.
Week 4 Oct. 3
Linear and non-linear operations.  Image enhancement in the spatial domain.  Basic gray-level transformations (image negatives, log transforms, power-law and piecewise transforms).  Histogram processing
Chapter Two:
2.6

Chapter Three
3.1, 3.2, 3.2.1, 3.2.2, 3.2.3, 3.2.4,
Final notes
 3Slides/page
 4Slides/page
This lecture will begin by continuing the discussion of linear and non-linear operators that we started during last week's lecture.  Following this, image enhancement in the spatial domain will be introduced.  Spatial domain image enhancement refers to modifying the image in some manner via operations performed directly on the pixels (e.g., intensity values) themselves.  The mathematical definition of an image operator will be introduced and several common image enhancement operators will be covered.  Finally, image histograms will be introduced.
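If you want to try a couple of the basic gray-level transformations yourself, here is a small sketch in Python/NumPy of the image negative and the power-law (gamma) transform; it is only an illustration of the formulas, not part of the lab material:

    import numpy as np

    def power_law(image, gamma, c=1.0):
        # Power-law (gamma) transform s = c * r**gamma; intensities are first
        # normalized to [0, 1] and then rescaled back to the 0-255 range.
        r = image.astype(np.float64) / 255.0
        s = c * np.power(r, gamma)
        return np.clip(s * 255.0, 0, 255).astype(np.uint8)

    def negative(image):
        # Image negative: s = 255 - r for an 8-bit image
        return 255 - image

    # gamma < 1 brightens dark regions, gamma > 1 darkens the image
    dark_pixels = np.array([[10, 40, 90]], dtype=np.uint8)
    print(power_law(dark_pixels, 0.4))   # noticeably brighter values
    print(negative(dark_pixels))         # [[245 215 165]]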

Here are some links/references for your own interest.
Lab:
This week's lab: Lab 3.  In this lab, both LabView and IMAQ will be used. You will also require a digital camera and related equipment.  No lab report is required for this lab although there is an assignment due October 17, 2005.
Week 5 Oct. 10





Thanksgiving holiday - no lecture!
Week 6 Oct. 17
Image enhancement in the spatial domain:  Histograms and histogram processing, Arithmetic operators, Basics of spatial filtering, Smoothing spatial filters
Chapter Three
3.3 (up to page 90), 3.4, 3.4.1, 3.5, 3.6, 3.6.1
Final notes
 3Slides/page
 4Slides/page

The first part of this lecture will continue with the introduction to histograms and histogram processing that we started at the end of the last lecture (Oct. 3).  We will then proceed to discuss arithmetic operations (e.g., addition, subtraction, etc., which I briefly discussed during the Oct. 3 lecture).
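Just for illustration, here is a small Python/NumPy sketch of the two ideas reviewed here, computing a gray-level histogram and a simple arithmetic operation (image subtraction); the labs use Matlab/LabView, so this is only a sketch of the concepts:

    import numpy as np

    # Gray-level histogram: counts per intensity value 0-255
    img = (np.random.rand(64, 64) * 255).astype(np.uint8)
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    print(hist.sum() == img.size)        # True: every pixel is counted exactly once

    # Image subtraction (e.g., for change detection); cast to int to avoid uint8 wrap-around
    other = (np.random.rand(64, 64) * 255).astype(np.uint8)
    diff = np.abs(img.astype(int) - other.astype(int)).astype(np.uint8)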

The second part of this lecture will introduce the concept of filtering an image in the spatial domain.  In particular, we will discuss the "mechanics" of filtering an image with a filter in the spatial domain (e.g., by directly manipulating the image's pixel gray levels); the type of filter depends on the application, however the "mechanics" remain the same.  A filter is also known as a template, kernel or mask, among other names.  We briefly discussed the concept of a template (mask, etc.) during the Oct. 3 lecture; we will review the concepts introduced there and then go into further depth.  Once we have familiarized ourselves with spatial filtering using a kernel, an application of it (averaging/blurring) will be introduced.  This is actually a very important topic and it is highly recommended that you read over the lecture notes and the appropriate sections in the book very carefully!

Additional material relevant to the lecture:
  • Try some simple examples of spatial filtering on your own.  Begin by computing the double summation convolution formula introduced in class for a 3 x 3 mask with all coefficients equal to 1, then try a 5 x 5 mask.  The material provided in the notes should be sufficient for you to do this (but of course you also have the book to look at as well)!  A small sketch follows below if you would like a starting point.
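Here is that starting point: a minimal Python/NumPy sketch of the double summation from the notes, applied with a 3 x 3 averaging mask.  It is only an illustration of the "mechanics" (border pixels are simply skipped), not the way the labs implement it:

    import numpy as np

    def spatial_filter(image, mask):
        # Double summation from class: g(x,y) = sum_s sum_t w(s,t) * f(x+s, y+t),
        # with the mask centred on (x,y).  Border pixels are skipped to keep it short.
        m, n = mask.shape
        a, b = m // 2, n // 2
        out = np.zeros_like(image, dtype=np.float64)
        for x in range(a, image.shape[0] - a):
            for y in range(b, image.shape[1] - b):
                region = image[x - a:x + a + 1, y - b:y + b + 1]
                out[x, y] = np.sum(mask * region)      # the double summation
        return out

    # 3 x 3 averaging mask: all coefficients 1, normalized by 1/9
    mask3 = np.ones((3, 3)) / 9.0
    f = (np.random.rand(8, 8) * 255).round()
    g = spatial_filter(f, mask3)                       # blurred version of f

Try replacing mask3 with a 5 x 5 mask of ones divided by 25 to repeat the exercise for the larger kernel.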
Lab:
This week's lab: Lab 4.  In this lab, IMAQ will be used along with a digital camera and related equipment.  There is no lab report required for this lab; however, there is an assignment due October 31, 2005.  The assignment consists of the following questions from the textbook:  Chapter 3: 3.1, 3.12, 3.13
Week 7 Oct. 24
Review of the basics of spatial filtering, Smoothing spatial filters, Sharpening spatial filters
Chapter Three
3.6, 3.6.1, 3.6.2, 3.7, 3.7.1 (up to page 125), 3.7.3, 3.8

Final notes
 3Slides/page
 4Slides/page
This lecture will include a brief review of the "mechanics" of spatial filtering, followed by a discussion on how we can use specific kernels to perform various operations on an image by using the "mechanics" of spatial filtering.  In particular, we will examine smoothing spatial filters, which are used to remove noise from an image (and also to "blur" an image), followed by sharpening spatial filters, which are used to "sharpen" an image and "highlight" sharp transitions between intensity values (e.g., edges).

Lab:
This week's lab: Lab 5.  In this lab, IMAQ and LabView will be used along with a digital camera and related equipment.  A lab report is required for this lab; however, there is NO assignment.  The lab report is due Oct. 31, 2005.  The lab itself introduces material which we may not necessarily cover in the lectures and some material which we will cover later on.  However, the lab should be fun and interesting regardless and should be easy to follow.
Week 8 Oct. 31
Sharpening spatial filters, Introduction to edges, Introduction to the first and second order derivatives, Combining spatial filtering techniques
Chapter Three
3.6, 3.6.1, 3.6.2, 3.7, 3.7.1 (up to page 125), 3.7.3, 3.8
Final notes
 3Slides/page
 4Slides/page

Mid-term review material
In this lecture we will continue our discussion of sharpening spatial filters.  This discussion will then provide a brief introduction to edges and how edges can be modeled, followed by a discussion of the first order digital derivative.  We will then focus on the second order derivative and how it can be modeled.  Both the first and second order derivatives are used to detect edges.  Finally, the lecture will conclude with a brief discussion of how both smoothing and sharpening filters can be combined.  Example applications will be provided.
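To get a feel for how the digital derivatives respond to an intensity ramp and to an isolated bright point, here is a small Python/NumPy sketch using the standard first and second order digital derivative definitions; it is an illustration only, and the intensity profile below is made up for the example:

    import numpy as np

    # Digital derivatives of a 1D intensity profile:
    #   df/dx   = f(x+1) - f(x)
    #   d2f/dx2 = f(x+1) + f(x-1) - 2*f(x)
    f = np.array([5, 5, 4, 3, 2, 1, 0, 0, 0, 6, 0, 0], dtype=float)  # ramp, flat run, isolated point
    first  = f[1:] - f[:-1]                    # length len(f)-1
    second = f[2:] + f[:-2] - 2 * f[1:-1]      # length len(f)-2
    print(first)    # responds along the entire ramp and strongly at the isolated bright point
    print(second)   # responds only at the ends of the ramp and (doubly, with a sign change) at the point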

Lab:
There is no lab this week; however, there will be a review for the mid-term test during the first half of the lab period.  An overview of the material you are responsible for is provided, and we will briefly go over it during the review.  Finally, the review is optional and you do not have to attend (e.g., attendance will not be taken), although it is recommended that you do!
Week 9 Nov. 7




Mid-term test (no lecture).
Week 10 Nov. 14
Second order derivative, the Laplacian, Introduction to the Fourier transform
Chapter Three
3.7, 3.7.1, 3.7.3, 3.8

Chapter Four
4.1, 4.2, 4.2.1
Final notes
 3Slides/page
 4Slides/page

Background (from Richard G. Lyons' Understanding Digital Signal Processing book):

The Arithmetic of Complex Numbers

1D discrete Fourier transform example
During the first part of this lecture we will finish off our discussion of image enhancement in the spatial domain.  We will review the second order digital derivative and introduce the Laplacian operator.  Finally, we will examine the combining of spatial enhancement techniques.  The second part of this lecture will focus on the Fourier transform.  In particular, we will begin with some background to the Fourier transform followed by an introduction to the one-dimensional Fourier transform and some of its properties.  Although we are interested in the two-dimensional Fourier transform in this course, it can be generalized from the one-dimensional Fourier transform, hence we will begin with the one-dimensional case.  I suggest you take a look at the notes I have added from Richard Lyons' book on the arithmetic of complex numbers and the 1D Fourier example.
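Related to the 1D Fourier example from the Lyons notes, here is a small Python/NumPy sketch (the labs use Matlab, where fft() behaves the same way); the sampled cosine below is just an illustrative signal:

    import numpy as np

    # Sample one full cycle of a cosine and look at the magnitude of its DFT
    M = 8                                    # number of samples
    n = np.arange(M)
    x = np.cos(2 * np.pi * 1 * n / M)        # one cycle over the M samples
    X = np.fft.fft(x)
    print(np.round(np.abs(X), 3))            # peaks of height M/2 = 4 at m = 1 and m = M-1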

Finally, the following links are provided for your interest
  • More about Jean Baptiste Joseph Fourier (a biography)
  • For some fun, download and listen to "Fourier's Song" (mp3) by Dr. Time and Brother Fre(quency) (well, in reality by Dr. Robert Williamson of the Australian National University).  Link includes lyrics to the song - it's actually quite amusing!
Lab:
We will be working on lab 6 this week.  The lab may span two weeks depending on how far we get.  This lab deals primarily with Matlab and it is recommended that you read the lab prior to the lab period.  The lab also makes use of the following image: lenna.jpg (this image is actually one of the most popular images in the computer vision/image processing fields - it dates back 25-30 years!)
Week 11
Nov. 21
Brief introduction to the  1D Fourier transform and its properties, Introduction to the 2D Fourier transform and its properties, Introduction to Filtering in the Fourier domain
Chapter Three
3.7, 3.7.1, 3.7.3, 3.8

Chapter Four
4.1, 4.2, 4.2.1
Prelim notes
 3Slides/page
 4Slides/page

Background (from Richard G. Lyons' Understanding Digital Signal Processing book):

The Arithmetic of Complex Numbers

1D discrete Fourier transform example
In this lecture we will continue the discussion of the 1D Fourier transform that we started last week, including a discussion of some of its properties.  The lecture will then focus on the 2D Fourier transform: after an introduction to the 2D Fourier transform, some of its properties will be discussed, followed by some examples.  Time permitting, we will look at filtering of images in the frequency domain.  Once again, I suggest you take a look at the notes I have added from Richard Lyons' book on the arithmetic of complex numbers and the 1D Fourier example.
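For your own experimentation, here is a minimal Python/NumPy sketch of the 2D transform (np.fft.fft2 plays the role of the 2D Fourier transform developed in the lecture; the random image below is just a stand-in):

    import numpy as np

    # The zero-frequency (DC) term of the 2D DFT equals the sum of all pixel
    # values, and fftshift moves it to the centre of the spectrum for display.
    f = np.random.rand(64, 64)
    F = np.fft.fft2(f)
    print(np.allclose(F[0, 0], f.sum()))         # True: DC term = sum of the image
    F_centred = np.fft.fftshift(F)               # centre the spectrum
    magnitude = np.log1p(np.abs(F_centred))      # log scaling, commonly used for display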

Lab:

We will continue working on lab 6 this week.
Just a reminder that you can actually work on this lab on your own, during your own time, since it does not require any camera/equipment or LabView/IMAQ Vision (it does, of course, require Matlab, but Matlab is installed on machines in several accessible labs).  I recommend you work on this lab outside of the regularly scheduled lab hours.  The lab also makes use of the following image: lenna.jpg (this image is actually one of the most popular images in the computer vision/image processing fields - it dates back 25-30 years!)
Week 12
Nov. 28
Continue with our discussion of the 2D Fourier transform, Introduction to filtering in the frequency domain, Properties of the frequency domain, Convolution Theorem, Gaussian filters
Chapter Four
4.1, 4.2, 4.2.1, 4.2.2, 4.2.3, 4.2.4
Final notes
 3Slides/page
 4Slides/page

Background (from Richard G. Lyons' Understanding Digital Signal Processing book):

The Arithmetic of Complex Numbers

1D discrete Fourier transform example
In this lecture we will continue our discussion of the 2D Fourier transform, followed by a discussion of filtering in the frequency domain, where we will examine both low- and high-pass filters in detail.  We will then discuss the Gaussian filter, followed by a discussion of the Convolution Theorem (this is very important!).  I would also like to review the "1D discrete Fourier transform example" from Lyons' book.
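As a small illustration of the Convolution Theorem (filtering in the frequency domain is element-wise multiplication by a transfer function), here is a Python/NumPy sketch of a Gaussian low-pass filter built directly in the frequency domain; the image is random and the cutoff D0 = 20 is arbitrary, so treat this only as a sketch of the idea:

    import numpy as np

    def gaussian_lowpass(shape, D0):
        # H(u,v) = exp(-D(u,v)^2 / (2*D0^2)), centred on the (shifted) origin
        rows, cols = shape
        u = np.arange(rows) - rows // 2
        v = np.arange(cols) - cols // 2
        V, U = np.meshgrid(v, u)
        D2 = U**2 + V**2
        return np.exp(-D2 / (2.0 * D0**2))

    f = np.random.rand(128, 128)
    F = np.fft.fftshift(np.fft.fft2(f))              # centred spectrum
    H = gaussian_lowpass(f.shape, D0=20)             # Gaussian low-pass transfer function
    G = H * F                                        # filtering = multiplication (Convolution Theorem)
    g = np.real(np.fft.ifft2(np.fft.ifftshift(G)))   # back to the spatial domain: a blurred image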

Lab:
We will continue working on lab 6 this week.
I anticipate that you will complete the lab within the first half of the lab period.  Afterwards, we will start Lab 7, which examines edge detection using IMAQ Vision Builder.  No camera or equipment is required for this lab.  This lab should also be fairly straightforward to complete.  A lab report is required for this lab.

Reminder: No lab report required for Lab 6.

A further reminder (summary) regarding the DFT:

Given a 1D input sequence x[n] of size M (e.g., M samples), after performing a DFT operation on the input sequence we obtain our output DFT sequence X[m], also of size M.  (A small worked sketch follows the summary below.)

  • x denotes the discrete input signal.
  • X denotes the discrete DFT output.
  • The size (number of samples) in our input sequence is equal to the number of samples in our output sequence (e.g., equal to M for both).  In other words, the size of the input signal (sequence) determines the size of our output signal (sequence).
  • M denotes the size of our input signal (e.g., number of input samples) and the size of the output DFT signal (e.g., number of DFT samples).  Remember, input sequence and output sequence are discrete!
  • n denotes the index into our input signal (e.g., x[n] is the nth input sample).
  • m denotes the index into our DFT output signal (e.g., X[m] = mth DFT output).
  • Keep in mind, m and n are simply indices - any letter can be used to denote an index but we will use these two as defined above for consistency.
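Here is that worked sketch: a direct Python/NumPy implementation of the summary above, using the usual DFT summation X[m] = sum over n of x[n]*exp(-j*2*pi*m*n/M); the short input sequence is made up for the example:

    import numpy as np

    def dft(x):
        # Direct DFT: M input samples x[n] produce M output samples X[m]
        M = len(x)
        X = np.zeros(M, dtype=complex)
        for m in range(M):                       # one output sample per index m
            for n in range(M):                   # sum over all M input samples
                X[m] += x[n] * np.exp(-2j * np.pi * m * n / M)
        return X

    x = np.array([1.0, 2.0, 0.0, -1.0])
    X = dft(x)
    print(len(x), len(X))                        # 4 4  -> input and output sizes match (both M)
    print(np.allclose(X, np.fft.fft(x)))         # True -> agrees with NumPy's FFT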

Week 13 Dec. 5
Smoothing and sharpening frequency domain filters,  Discontinuity detection and Image segmentation
Chapter 10
10.1, 10.1.1, 10.1.2, 10.1.3, 10.3.3, 10.3.4, 10.4, 10.4.1, 10.4.2, 10.4.3
Prelim notes
 3Slides/page
 4Slides/page
In this lecture we will conclude our discussion of frequency domain filtering by examining smoothing and sharpening filters.  The remainder of the lecture will then be on discontinuity detection and image segmentation.  In particular, we will examine point and line detection, in addition to applying a threshold to an image in order to detect the objects within it.
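As a tiny illustration of segmentation by thresholding, here is a Python/NumPy sketch (the 3 x 3 image and the threshold T = 128 are made up for the example; the labs use IMAQ/Matlab):

    import numpy as np

    def threshold(image, T):
        # Global thresholding: pixels above T are labelled object (255), the rest background (0)
        return (image > T).astype(np.uint8) * 255

    img = np.array([[ 10,  20, 200],
                    [ 15, 220, 210],
                    [ 12,  18,  25]], dtype=np.uint8)
    print(threshold(img, T=128))
    # [[  0   0 255]
    #  [  0 255 255]
    #  [  0   0   0]]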

Lab:
We will continue working on lab 8 this week.
This is the last lab of the term and should be straightforward to complete.  You will require the use of a camera and camera equipment.  No lab report is required for this lab.
Week 14
Dec. 12
Eyes 'n Ears: A System for Attentive Teleconferencing and Remote Distance Learning


Example "real-life" computer vision/image processing system that is used to detect faces and hand-raising gestures in a sequence of images.  Further details regarding the system can be found in the following paper:
Review:
The second half of the lecture will be a review in preparation for your final exam.  We will cover the 1D Fourier transform example available from here.  The review will also include a brief overview of the sections you are responsible for, in addition to taking any of your questions.
Week 15
Dec. 19



Final test (no lecture) - Good Luck!


  This page maintained by Bill Kapralos
Last modified:  Thursday, December 8 2005