Friday, 23 November 2012

3.1.3 Finite State Machines

A Finite State Machine (FSM) is a machine which accepts an input and produces an output.

There are 3 types of FSM: a Mealy Machine, a Moore Machine and a Finite State Automaton.

A Finite State Automaton (FSA) is an FSM which has a finishing state and only outputs a YES/NO answer based on which state the input string ends on.

A Mealy Machine is an FSM where the output is on the transitions as opposed to the states.

A Moore Machine is an FSM where the output is on the states as opposed to the transitions.
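To make the FSA's YES/NO behaviour concrete, here's a minimal Python sketch (the alphabet, states, transitions and accepting state here are made up for illustration):

# A tiny made-up FSA that says YES to binary strings ending in 1.
TRANSITIONS = {
    ("S1", "0"): "S1",
    ("S1", "1"): "S2",
    ("S2", "0"): "S1",
    ("S2", "1"): "S2",
}
START_STATE = "S1"
ACCEPTING_STATES = {"S2"}   # the YES state

def fsa_accepts(input_string):
    state = START_STATE
    for symbol in input_string:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING_STATES   # YES/NO depending on the final state

print(fsa_accepts("0101"))  # True  - ends in state S2
print(fsa_accepts("0110"))  # False - ends in state S1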

An FSM/FSA can be represented in a variety of ways; the most common is a State Transition Diagram.

This is an example of an FSA as shown by a State Transition Diagram. The arrow with no number above it shows the Starting State, and the yellow circle around State 2 indicates that it's the state which will give an output of YES.

An FSM can also be represented by a Transition Table.

Current State:  S1  S1  S2  S2  S3  S3
Input:           0   1   0   1   0   1
Next State:     S2  S3  S3  S3  S1  S1
Output:          A   B   C   B   C   B

This table shows that in State 1, if you input 0 the FSM would output A and the next State would be State 2.


The third way to represent an FSM is as a Transition Function. These have the standard form (Input Symbol, Current State) -> (Output Symbol, Next State). For example, (1, S3) -> (B, S1) from the table above would mean that if you are currently in State 3 and you input 1, the FSM will output B and move to State 1.
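As a rough sketch, that transition function maps straight onto a Python dictionary keyed by (input symbol, current state), using the values from the table above:

# The transition table above written as a transition function:
# (input symbol, current state) -> (output symbol, next state)
TRANSITION_FUNCTION = {
    ("0", "S1"): ("A", "S2"),
    ("1", "S1"): ("B", "S3"),
    ("0", "S2"): ("C", "S3"),
    ("1", "S2"): ("B", "S3"),
    ("0", "S3"): ("C", "S1"),
    ("1", "S3"): ("B", "S1"),
}

def run_fsm(input_string, start_state="S1"):
    state = start_state
    outputs = []
    for symbol in input_string:
        output, state = TRANSITION_FUNCTION[(symbol, state)]
        outputs.append(output)
    return outputs, state

print(run_fsm("010"))  # (['A', 'B', 'C'], 'S1')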

Finally, there are 2 types of FSM/FSA: Deterministic and Non-Deterministic. Deterministic means there is only 1 possible transition for each input from a given state, while Non-Deterministic means there can be more than one possible transition per input, allowing for more than 1 route round a State Transition Diagram.
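One way to picture Non-Determinism in code is a sketch where each (state, input) pair maps to a set of possible next states, so every route has to be tracked at once (the transitions here are made up for illustration):

# Non-deterministic transitions: each (state, input) maps to a SET of next states.
NFA_TRANSITIONS = {
    ("S1", "0"): {"S1", "S2"},   # two possible routes for the same input
    ("S1", "1"): {"S1"},
    ("S2", "1"): {"S3"},
}

def possible_states(input_string, start_state="S1"):
    current = {start_state}
    for symbol in input_string:
        next_states = set()
        for state in current:
            next_states |= NFA_TRANSITIONS.get((state, symbol), set())
        current = next_states
    return current

print(possible_states("01"))  # {'S1', 'S3'} - more than one route was followed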

Oh, and also: a Halting State is a state with no outgoing transitions.

Friday, 16 November 2012

3.1.2 Comparing Algorithms

Algorithms have varying levels of complexity. This covers 6: Constant, Linear, Logarithmic, Polynomial, Exponential and Factorial.


A graph showing a visual representation of the various complexities in relation to each other.

Constant Complexity (also known as O(1)) is a problem where the time taken doesn't grow even if the input size varies. For example, an algorithm that finds out if a number is even or odd is O(1) complexity, as no matter how large the number is, the time taken is the same. O(1) complexity doesn't have to be just a single line; ordering the first 15 records of a file, for example, will take the same amount of time whether the file is 15 records long or 150 records long.
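The even/odd check from above as a tiny Python function:

def is_even(number):
    # One modulo operation regardless of how big the number is: O(1).
    return number % 2 == 0

print(is_even(4))              # True
print(is_even(987654321987))   # False - takes no longer than the small case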

Linear Complexity (O(n)) is a problem which will take proportionally more time as the input gets bigger. For example, searching through an unordered file: searching a file with 10 records will be about 5 times faster than searching a file with 50 records.
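A linear search sketch in Python - in the worst case it looks at every record once, so the time grows in step with the number of records (the record list is made up for illustration):

def linear_search(records, target):
    # Checks the records one by one: O(n) in the worst case.
    for index, record in enumerate(records):
        if record == target:
            return index
    return -1

print(linear_search(["Ann", "Bob", "Cat", "Dan"], "Dan"))  # 3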

Logarithmic Complexity (O(log n)) is a problem which will take longer the larger the input is, but not to the extent of O(n). A binary search is a good example: the longer the list, the longer it will take to search, but nowhere near as long as a Linear Search would take.
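A binary search sketch in Python - each comparison throws away half of the remaining list, which is why the time only grows logarithmically:

def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            low = mid + 1    # discard the lower half
        else:
            high = mid - 1   # discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 9))  # 4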

Polynomial Complexity (O(n^k)) is a problem that takes longer the larger the input is, and the increase in time is faster than the increase in n. Polynomial Complexity includes Quadratic Complexity O(n^2) and Cubic Complexity O(n^3). An example of O(n^2) would be shaking hands with everyone in a room: with 2 people only 1 handshake is needed, but with 7 people 21 are needed.
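The handshake example in code - the nested loop over every pair of people is the giveaway that this is O(n^2):

def count_handshakes(number_of_people):
    # Every pair shakes hands once: n*(n-1)/2 handshakes, which is O(n^2).
    handshakes = 0
    for i in range(number_of_people):
        for j in range(i + 1, number_of_people):
            handshakes += 1
    return handshakes

print(count_handshakes(2))  # 1
print(count_handshakes(7))  # 21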

Exponential Complexity (O(k^n)) covers problems where the solution can only be found for small values of n. An example would be "The Chinese Emperor and The Warlord": the payment picked was the chessboard method, where 1 grain of rice was placed on the first square, 2 on the second, then 4 on the third and so on. As the number of squares increases, the number of grains of rice grows exponentially. For small inputs it can be worked out, but for larger and larger inputs it can potentially take longer than the Earth will be around for.
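The chessboard payment in code - the number of grains doubles on every square (2^(n-1) grains on square n), which is exactly the kind of growth that makes exponential problems blow up:

def grains_on_square(square_number):
    # 1 grain on square 1, 2 on square 2, 4 on square 3... doubling each time.
    return 2 ** (square_number - 1)

def total_grains(squares=64):
    return sum(grains_on_square(s) for s in range(1, squares + 1))

print(grains_on_square(64))  # 9223372036854775808
print(total_grains())        # 18446744073709551615, i.e. 2^64 - 1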

Factorial Complexity (O(n!)) covers problems that can't be solved in a reasonable amount of time. An example would be the Travelling Salesman Problem; how to solve it most efficiently is still being researched. It can be brought down to O(k^n) if Dynamic Programming is used, but that still means only the smallest versions of the problem can be solved. Quantum Computing may be able to help with these sorts of problems, but that's still a way off as it's still in its infancy.
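A brute-force sketch of the Travelling Salesman Problem in Python (the cities and distances are made up for illustration): it tries every possible ordering of the cities, and the number of orderings grows as n!, which is why only tiny inputs are practical.

from itertools import permutations

# Made-up symmetric distances between 4 cities.
DISTANCES = {
    ("A", "B"): 10, ("A", "C"): 15, ("A", "D"): 20,
    ("B", "C"): 35, ("B", "D"): 25, ("C", "D"): 30,
}

def distance(city1, city2):
    return DISTANCES.get((city1, city2)) or DISTANCES[(city2, city1)]

def shortest_tour(cities):
    # Checks all n! orderings of the cities - fine for 4, hopeless for 40.
    best_route, best_length = None, float("inf")
    for route in permutations(cities):
        length = sum(distance(route[i], route[i + 1]) for i in range(len(route) - 1))
        length += distance(route[-1], route[0])   # travel back to the start
        if length < best_length:
            best_route, best_length = route, length
    return best_route, best_length

print(shortest_tour(["A", "B", "C", "D"]))  # (('A', 'B', 'D', 'C'), 80)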

Monday, 12 November 2012

3.1.1 Information Hiding and Abstraction

Information Hiding: Hiding the complexity of a system to simplify the experience for the user.

Abstraction: The process of removing unnecessary details in the representation of a system.

To Abstract, you Hide Information to create an Abstraction.

Levels of Abstraction
Application Level: Hides complex process(es) behind an icon.
High Level Programming Language: Hides low level commands behind a single high level one e.g. IF
Machine Code: Hides complex electronics behind a command e.g. ADD
Electronics: Hides the physical atomic interactions behind electronic components.