Sunday, 30 November 2014

Week 11 and Week 12

In week 11, we learnt a little about the history of algorithms through two individuals, Alonzo Church and Alan Turing. I found it very interesting to know that they showed that some problems can't be solved by any algorithm, person, or computer. I was surprised, as I thought that computers could solve even the most complicated math or logical problems. As well, I was surprised to learn that computer programming languages today can solve exactly the same set of problems as the original Turing machine, no more and no fewer. This is because I know that languages and machines have become far more advanced now compared to the past.

We also learnt the concept of halting: a function, on a given input, can either halt or not halt; when it doesn't halt, it runs forever, for example by entering an infinite loop. The professor also showed us a function which cannot be implemented: the halting function H, which would report whether another function halts on a given input. This is the code:

    def H(f, i):
        '''Return True if f(i) will halt, False otherwise.'''

        # Once you've figured out how to implement this, delete the next line
        return True

As seen, it looks simple but is provably impossible to implement.
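To help myself remember why, here is a sketch of my own (not from lecture) of the classic contradiction: if H existed, we could build a function that defeats it.

```python
def H(f, i):
    '''Hypothetical: return True if f(i) halts, False otherwise.
    No correct implementation can exist.'''
    raise NotImplementedError("uncomputable")


def confused(f):
    '''If H worked, what would H(confused, confused) return?
    If True, then confused(confused) loops forever; if False, it halts.
    Either answer makes H wrong -- that is the contradiction.'''
    if H(f, f):
        while True:
            pass
    return 42
```

The names H and confused are just my own labels for the usual diagonal argument.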

We also learnt about reduction, where one problem is solved using another. The example was a program that predicts whether a variable x gets initialized. The general statement is: if f reduces to g, then g computable ===> f computable, so if f is non-computable, g must be non-computable too.

Furthermore, partially-working versions of halt, other uncomputable functions, and how to count finite sets were other concepts covered in lecture.

Infinity doesn't equal infinity: this was a very interesting concept for me to learn, as I initially thought that infinity always had to equal itself. But in lecture, I was told that not all infinite sets are the same size: even though two sets can both be unending, one can still be strictly bigger than the other.

Also, the professor highlighted the concept of counting with 1-1 and onto functions, and how the set of natural numbers has the same size as its subset, the even natural numbers. One more interesting thing that I found was that diagonalization proves that not all sets are countable.
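The 1-1 and onto idea can be checked concretely: pairing each natural number n with 2n matches the naturals with the even naturals, with nothing left over on either side. A small sketch of my own:

```python
def to_even(n):
    '''1-1 and onto map from the naturals to the even naturals.'''
    return 2 * n


def from_even(m):
    '''Inverse map, showing the pairing goes both ways.'''
    return m // 2


# Every natural maps to a distinct even number, and every even number is hit.
pairs = [(n, to_even(n)) for n in range(5)]
```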

In our last week of classes, we learnt additional concepts ranging from Cantor's example to writing an inductive hypothesis. One concept that I found interesting this week was the principle of simple induction: if a predicate over the natural numbers is true for the first value, and its truth is passed from each value to the next, then it is true for all natural numbers. As well, when the notation is given like this:

∀n ∈ ℕ, n ≥ k ⇒ P(n)

it means that P is true for all natural numbers greater than or equal to k.
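As a concrete example of my own (not one from lecture), take P(n) to be "the sum 0 + 1 + ... + n equals n(n+1)/2". A quick script can check the base case and a stretch of the chain along which the truth gets passed:

```python
def P(n):
    '''Predicate: the sum of 0..n equals the closed-form formula.'''
    return sum(range(n + 1)) == n * (n + 1) // 2


# Base case, plus the truth being passed from each value to the next.
base_holds = P(0)
chain_holds = all(P(n) for n in range(1, 100))
```

Of course, checking finitely many values is not a proof; induction is what covers all of ℕ at once.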

On Friday, the professor conducted another activity involving a rectangular grid with a line passing through it diagonally, with the squares the line passed through shaded. We had to use various techniques to try and derive a formula for the number of shaded squares. It was difficult and I didn't succeed, but I enjoyed doing it very much, because it gave me a chance to apply my critical-thinking skills once again in this course.
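After the fact, I looked this puzzle up; if I understand it correctly, the standard answer for an m-by-n grid is m + n − gcd(m, n) shaded squares. This formula is my own addition, not something we derived in class:

```python
from math import gcd


def crossed_squares(m, n):
    '''Number of unit squares a corner-to-corner diagonal passes through
    in an m-by-n grid (standard result; not derived in lecture).'''
    return m + n - gcd(m, n)


# e.g. in a 2-by-3 grid, the diagonal shades 2 + 3 - 1 = 4 squares
```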

Personally, the concepts were generally moderate in terms of difficulty. This is why I asked my professor questions during lecture, and he clarified my doubts very well. I intend to keep doing this, in addition to rereading, whenever I don't understand the lecture material, as it is part of seeking help from others.

Friday, 28 November 2014

Week 9 and Week 10

These two weeks have been intensive with concepts, ranging from proving bounds on polynomial functions to Big-Omega and Big-Theta. So far, I have seen that the course is becoming progressively more difficult. So, I know that I will have to spend more time rereading my lecture notes and trying to understand the logic hidden in the various problems involved in mathematical reasoning.

One concept I learnt during Week 9 is that of a loop guard. A loop guard is the condition that controls a loop: it is checked before every iteration, and the loop body runs only while it holds, so it determines when the loop finishes. By contrast, an initialization statement for a variable, such as max = 0 or i = 0, is run through only once, before the loop starts. I found the proofs about polynomial bounds to be quite challenging, so I decided to include the proof for one of the functions.
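To keep the two ideas straight, here is a small example of my own, with the initializations and the loop guard labelled:

```python
def find_max(lst):
    '''Return the largest element of a non-empty list.'''
    i = 0                    # initialization: runs exactly once, before the loop
    max_val = lst[0]         # initialization as well
    while i < len(lst):      # loop guard: checked before every iteration
        if lst[i] > max_val:
            max_val = lst[i]
        i = i + 1
    return max_val           # reached once the guard i < len(lst) fails
```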

7n**6 - 5n**4 + 2n**3 ∈ O(6n**8 - 4n**5 + n**2); the proof uses the formal definition of Big-Oh.

Here is the proof:

Choose c = 9/2. Then c ∈ ℝ+.
Choose B = 0. Then B ∈ ℕ.
Assume n ∈ ℕ.                                        # n is a typical natural number
      Assume n >= B.                                 # antecedent
            Then 7n**6 - 5n**4 + 2n**3
                 <= 7n**6 + 2n**3                    # since -5n**4 <= 0
                 <= 7n**6 + 2n**6                    # since 2n**3 <= 2n**6
                 = 9n**6
                 <= 9n**8                            # since n**6 <= n**8
                 = c(2n**8)                          # c = 9/2
                 = c(6n**8 - 4n**8)
                 <= c(6n**8 - 4n**5)                 # since -4n**8 <= -4n**5
                 <= c(6n**8 - 4n**5 + n**2)          # since 0 <= n**2
            Then f(n) <= c(6n**8 - 4n**5 + n**2).
      Then n >= B ⇒ f(n) <= c(6n**8 - 4n**5 + n**2).
Then ∀n ∈ ℕ, n >= B ⇒ f(n) <= c(6n**8 - 4n**5 + n**2).                    # introduce ∀n
Then ∃B ∈ ℕ, ∀n ∈ ℕ, n >= B ⇒ f(n) <= c(6n**8 - 4n**5 + n**2).            # introduce ∃B
Then ∃c ∈ ℝ+, ∃B ∈ ℕ, ∀n ∈ ℕ, n >= B ⇒ f(n) <= c(6n**8 - 4n**5 + n**2).   # introduce ∃c

------------------------------------------------------------------------------------------------------------------------
This is one of the various proofs that we were and are taught to write for polynomial bounds. It is a lengthy process, and it is also confusing. It even confused me, as there were so many steps that seemed like they were arbitrarily thought of and included. One of these steps was raising the degree of a term so that it becomes higher. This is the reason why I know that rereading these proofs and understanding them is important. Furthermore, I know that something like this will appear on the exam, so this is necessary to know, not just important.
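One thing that helps me trust the chosen constants is a quick numeric sanity check, my own idea and not part of the formal proof: with c = 9/2 and B = 0, the inequality should hold for every n I try.

```python
def f(n):
    '''The left-hand side of the claimed bound.'''
    return 7 * n**6 - 5 * n**4 + 2 * n**3


def g(n):
    '''The right-hand side, before multiplying by c.'''
    return 6 * n**8 - 4 * n**5 + n**2


c, B = 9 / 2, 0
# Spot-check f(n) <= c * g(n) for n >= B; no counterexample should appear.
ok = all(f(n) <= c * g(n) for n in range(B, 200))
```

A check like this can't replace the proof, but it would catch a badly chosen c or B immediately.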

In the 10th week, we were taught about the role of logarithmic and exponential functions in proofs. We were also taught how L'Hopital's rule applies in proofs, and I find it amazing that limits like these can be worked into proofs about growth rates. However, I find it equally frustrating that I do not understand how the proof progresses. I also wonder how quite a lot of people in my lecture session are able to understand this concept and offer the next step in the proofs' solutions. Knowing that this is the first time that we've seen these kinds of proof statements, I wonder how they understand what values to use for variables, for example, and I don't.
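What eventually helped me was watching the ratio that L'Hopital's rule is about. For example (my own experiment, not one from lecture), n / 2**n shrinks towards 0, which is the sense in which 2**n eventually dominates n:

```python
def ratio(n):
    '''Ratio of a linear function to an exponential one.'''
    return n / 2 ** n


# The ratio keeps shrinking as n grows, suggesting the limit is 0.
samples = [ratio(n) for n in (10, 20, 30, 50)]
```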

Additionally, we were taught the notation for Big-Omega, Ω(g). It is almost the same as the notation for Big-Oh, except that here, f(n) >= c·g(n), whereas in Big-Oh, f(n) <= c·g(n). We were also taught about the tight bound formed by combining Big-Oh and Big-Omega, which has the symbol Θ. As usual, some proofs related to these concepts were taught to us, which looked fairly reasonable for me to understand. However, if I were given a similar question, I would most likely not be able to do it. This is why I am rereading my notes and trying to improve my understanding of the course material.
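To see both bounds at once, I tried a small check of my own: for f(n) = 3n**2 + 5, the constants c1 = 3 and c2 = 4 with breakpoint B = 3 sandwich f between multiples of n**2, which is exactly what membership in Θ(n**2) asks for.

```python
def f(n):
    '''A sample quadratic-rate function.'''
    return 3 * n ** 2 + 5


c1, c2, B = 3, 4, 3
# Big-Omega part: c1*n**2 <= f(n); Big-Oh part: f(n) <= c2*n**2.
# Holding together for all n >= B is the Big-Theta sandwich.
sandwiched = all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(B, 500))
```

B = 3 matters here: at n = 2 we get f(2) = 17 > 4·4 = 16, so the upper bound only kicks in later, which is why the definitions allow a breakpoint at all.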

So far, I have been challenged and intimidated by the course material, despite my efforts to make sense of it. As well, I am apprehensive about the material included on the final exam, and about the level of difficulty of the exam.

Saturday, 1 November 2014

Week 7 and 8

In these two weeks, my professor taught us more concepts related to proofs, and even the concept of Big-Oh. Firstly, he showed us how to check whether a claim is true or false before trying to prove it either way. He also introduced us to some rules of inference and how they work, some of which we had already covered in detail. These inferences are:

1. Conjunction elimination: from A ∧ B, conclude A or conclude B separately
2. Existential instantiation: from ∃k ∈ X, P(k), pick an element with this property and let k' ∈ X be such that P(k')
3. Disjunction elimination: from A ∨ B and ¬A, conclude B
4. Implication elimination: from A ⇒ B and A, conclude B; from A ⇒ B and ¬B, conclude ¬A
5. Universal elimination: from ∀x ∈ X, P(x) and a ∈ X, conclude P(a)
6. Implication introduction: assume A and derive B, then conclude A ⇒ B
7. Universal introduction: prove P(a) for a generic a ∈ D, then conclude ∀a ∈ D, P(a)
8. Existential introduction: from P(a) for a particular a ∈ X, conclude ∃x ∈ X, P(x)
9. Conjunction introduction: from A and B, conclude A ∧ B
10. Disjunction introduction: from A alone, conclude A ∨ B
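Since these rules are just about truth values, I wrote a tiny truth-table check of my own for the two directions of implication elimination:

```python
def implies(a, b):
    '''Truth value of a => b.'''
    return (not a) or b


booleans = (True, False)

# Modus ponens: from (A => B) and A, B must follow.
ponens_ok = all(implies(implies(a, b) and a, b)
                for a in booleans for b in booleans)

# Modus tollens: from (A => B) and not B, not A must follow.
tollens_ok = all(implies(implies(a, b) and not b, not a)
                 for a in booleans for b in booleans)
```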

Furthermore, my professor also mentioned something about sorting algorithms and speed. If I remember correctly, he mentioned that when speed is measured by counting steps, it does not depend on the hardware the algorithm is running on, although the number of steps can still depend on the input. He also conducted a "penny piles" activity last Friday, where we were given a situation in which a number of pennies are in either of two drawers, and pennies could only be moved between the drawers in certain even-numbered amounts. We had to work out which distributions of pennies could be reached. This activity seemed complicated at first, but I slowly came to understand it and had fun doing it, even though I couldn't get the final answer.

Some more concepts that my professor covered in lecture were the running time of algorithms, and that it can be measured by counting how many steps it takes to reach the output. So basically:
running time of an algorithm = number of steps
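For instance (an example of my own, not from lecture), linear search can be instrumented to count its steps; on a list of length n, the worst case is n steps:

```python
def linear_search_steps(lst, target):
    '''Return (index of target or -1, number of loop steps taken).'''
    steps = 0
    for i, x in enumerate(lst):
        steps += 1            # one step per element examined
        if x == target:
            return i, steps
    return -1, steps          # worst case: every element was examined


# Worst case on a list of length 5: the target is absent, so 5 steps.
```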

My professor also showed us how to establish formulas for the number of steps a function takes based on its input. The class also learnt that the worst case W has two kinds of bounds: an upper bound and a lower bound. The upper bound means that the program will take no more than a certain number of steps in the worst case; the lower bound means that the program's worst case contains at least a certain number of steps. The upper bound is denoted by W ∈ O(U), and the lower bound by W ∈ Ω(L), where O is for oh, Ω is for omega, U is for the upper bound, and L is for the lower bound. Additionally, we learnt how to bound a sort so that, for example, W ∈ O(n**2) is proven.

As mentioned before, we also learnt the concept of Big-Oh of n**2. O(n**2) is a set of functions that take natural numbers as input and return non-negative real numbers, and it captures the idea that all quadratic functions grow at roughly the same rate. Lastly, we were shown how to prove W ∈ O(n**2) and W ∈ Ω(n**2).
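To convince myself that a quadratic sort really takes about n**2 steps, I counted the swaps in an insertion sort of my own; on a reversed list of length n, the count is n(n-1)/2, which lies in both O(n**2) and Ω(n**2):

```python
def insertion_sort_steps(lst):
    '''Sort a copy of lst; return (sorted list, number of swaps performed).'''
    items = list(lst)
    steps = 0
    for i in range(1, len(items)):
        j = i
        while j > 0 and items[j - 1] > items[j]:
            items[j - 1], items[j] = items[j], items[j - 1]
            steps += 1        # one step per swap
            j -= 1
    return items, steps


# Worst case: a reversed list of length 10 needs 10*9/2 = 45 swaps.
```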

Overall, the concepts for the first week were relatively easy for me to understand, but the second week's concepts were substantially more challenging. I understood part of them, but if I were given another problem related to them, I would most likely not be able to solve it or write a proof for it. I know that it takes time to understand and absorb the concept of Big-Oh, so this is why I am reading over my lecture notes to understand it better. Additionally, if I were given an expression and asked to prove a bound on it, I would have some difficulty doing so. This shows that I need to work on applying the concepts I have learnt. So far, this course has continued to be challenging, and I know that one can't succeed without practice and hard work. To overcome this difficulty, I aim to continue to read my lecture notes over and over again, do certain lecture problems without looking at the answers, and seek help when in doubt.