A microcontroller is a single-chip system that includes, at a minimum, a microprocessor, program memory, data memory, and input-output (I/O) circuitry. Many microcontroller systems also include timers, counters, analog-to-digital (A/D) converters, and other peripherals. This chapter outlines the concepts, terminology, and operation of microcontroller systems and introduces programming and system design using the programmable interface controller (PIC) series of microcontrollers manufactured by Microchip Technology Inc.

A microcontroller differs from a microprocessor: it is essentially a single-chip computer embedded in other devices or equipment to perform control functions, and is therefore also called an embedded controller. The operation of a microcontroller is explained with block diagrams so that the reader can fully understand the I/O functions. The different types of memory used in microcontrollers are described, and the hardware features are explained in general terms. Similarly, microcontroller architectures are illustrated with figures for the reader's benefit. The chapter gives exhaustive coverage of the number systems used in microprocessors and their conversion using arithmetic operations. Floating-point numbers and floating-point arithmetic are presented in detail with many examples, and the conversion equivalents are summarized in a table.
All microcontrollers require a clock (or an oscillator) to operate. The clock is usually provided by connecting external timing devices to the microcontroller. Most microcontrollers will generate their clock signals when a crystal and two small capacitors are connected; some will operate with resonators or external resistor–capacitor pairs, and some have built-in timing circuits and require no external timing components at all. If your application is not time-sensitive, use resistor–capacitor timing components, external or internal (if available), for simplicity and low cost.
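On PIC devices, the oscillator mode is selected through configuration bits rather than run-time code. As a minimal sketch, assuming a mid-range PIC such as the PIC16F877A and Microchip's XC8 compiler, the choice between crystal and RC clocking might look like this configuration fragment:

```c
/* Sketch only: device and pragma names assume a PIC16F877A under XC8.
   FOSC selects the oscillator mode set at programming time. */
#pragma config FOSC = XT    /* XT: crystal/resonator up to ~4 MHz      */
/* #pragma config FOSC = HS    high-speed crystal                      */
/* #pragma config FOSC = RC    external resistor-capacitor, low cost   */
/* #pragma config FOSC = LP    low-power watch crystal                 */
```

The exact option names vary between PIC families, so the device data sheet and the compiler's configuration-bit reference should be consulted for a specific part.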
An instruction is executed by fetching it from the memory and then decoding it. This usually takes several clock cycles and is known as the instruction cycle.