Sunday 2 December 2012

DFSA? NFSA? Regular Expression! (Part II)

So, we went over the basics in the previous post. This time I have something more challenging that will take a bit more effort to tackle.

Give a DFA for each language below.

    (a) L_1 = { s (- {0,1}* : s contains at least 2 characters and s's
            second character is a 1 }

        We only need to keep track of which character we are processing for
        the first 2 characters, to ensure the second one is a 1.  Using the
        ASCII conventions from lecture (parentheses around accepting
        states, self-loops indicated by labels next to states, dead states
        not shown), we get the following DFA:

                      0,1       1
            --> q_0 ----> q_1 ----> (q_2) 0,1

        (Note: this has implicit dead state q_3.)
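
        To double-check a machine like this, here is a small Python sketch
        (my own helper, not something from lecture) that simulates a DFA
        given as a dictionary of transitions, treating any missing
        transition as the implicit dead state:

            def run_dfa(transitions, start, accepting, s):
                """Return True iff the DFA accepts string s.

                transitions maps (state, symbol) -> state; a missing entry
                means the machine falls into an implicit dead state.
                """
                state = start
                for ch in s:
                    if (state, ch) not in transitions:
                        return False               # implicit dead state
                    state = transitions[(state, ch)]
                return state in accepting

            # DFA for L_1: the second character must be a 1.
            d1 = {('q0', '0'): 'q1', ('q0', '1'): 'q1',
                  ('q1', '1'): 'q2',
                  ('q2', '0'): 'q2', ('q2', '1'): 'q2'}

            for s in ['', '0', '01', '11', '10', '0100']:
                print(repr(s), run_dfa(d1, 'q0', {'q2'}, s))

        The same helper works for parts (b) and (c) by swapping in their
        transition tables.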

    (b) L_2 = { s (- {0,1}* : s contains fewer than 2 characters }

        Note that L_2 = { \epsilon, 0, 1 }.

                            0,1
            --> (q_0) ----> (q_1)

        (Note: this has implicit dead state q_2.)

    (c) L_4 = { s (- {a,b}* : every a in s is _eventually_ followed by b }
        For example, aaab (- L_4 because there is a 'b' that follows every
        'a'-- even though it is not immediately after the first two 'a's.

        Note that L_4 = { every string that does *not* end with an 'a' }.

                                   a
                      b         ----->     a
            -->  (q_0)                   q_1
                                <-----
                                   b

        (Note: q_1 is not accepting, since a string ending in 'a' must be
        rejected.  The 'b' next to (q_0) and the 'a' next to q_1 are
        self-loops.)
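
        Reusing the run_dfa helper sketched in part (a) (again, just my own
        checking tool, not part of the solution), this DFA can be encoded
        and tested like so:

            # DFA for L_4 over {a,b}: accept iff the string does not end in 'a'.
            d4 = {('q0', 'b'): 'q0', ('q0', 'a'): 'q1',
                  ('q1', 'a'): 'q1', ('q1', 'b'): 'q0'}

            for s in ['', 'aaab', 'ab', 'ba', 'bbb', 'aba']:
                print(repr(s), run_dfa(d4, 'q0', {'q0'}, s))
            # 'ba' and 'aba' end in 'a', so they should be rejected.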

Sunday 25 November 2012

DFSA? NFSA? Regular Expression!

I've found the questions on DFSAs and NFSAs quite interesting. They seem straightforward to me, yet when it comes to actually drawing the machines it becomes a different story. I managed to find some questions about DFSAs and NFSAs from the old 236 course, so we'll go over them today.

I realized it'd be nice to have some basics explained here. First, here's a more introductory question that I found in a past tutorial handout on the CSC236 course website.

 
List the elements of {a,ab}*.  Find a way to describe strings in this language, i.e., find a predicate P(s) such that P(s) is true iff s in {a,ab}*, for all strings s over alphabet {a,b,c}.

    Q:  What's {a,ab}*?
    A:  {a,ab}* = {e} U {a,ab} U {aa,aab,aba,abab} U {aaa,...} U ...
                = {e,a,ab,aa,aab,aba,abab,aaa,aaab,aaba,aabab,...}

    Q:  How can we describe strings in {a,ab}*?
    A:  This is almost like {a,b}* (all strings of a's and b's), except
        each 'b' comes with an 'a' in front, i.e., P(s) = s is a string
        of a's and b's where each b is immediately preceded by an a.

    Q:  We're done, right?
    A:  Almost: we should check that s (- {a,ab}* iff P(s), by arguing
        each direction:
        =>  It is obvious that each string in {a,ab}* has property P()
            (every b immediately preceded by an a).
        <=  Moreover, each string where each b is immediately preceded
            by an a belongs to {a,ab}*: the string can be broken up
            into pieces 'ab' (one for each b in the string) with every
            other symbol being 'a'.
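
    As a sanity check (this is just my own brute-force sketch in Python,
    not part of the tutorial), we can generate every string over {a,b,c}
    up to a small length, build the part of {a,ab}* up to that length by
    closure, and confirm that membership agrees with the predicate P(s):

        from itertools import product

        MAX_LEN = 6

        # All elements of {a,ab}* of length at most MAX_LEN.
        closure = {''}
        frontier = {''}
        while frontier:
            new = set()
            for s in frontier:
                for piece in ('a', 'ab'):
                    t = s + piece
                    if len(t) <= MAX_LEN and t not in closure:
                        new.add(t)
            closure |= new
            frontier = new

        def P(s):
            # every character is 'a' or 'b', and each 'b' is immediately
            # preceded by an 'a'
            return (all(c in 'ab' for c in s) and
                    all(i > 0 and s[i - 1] == 'a'
                        for i, c in enumerate(s) if c == 'b'))

        for n in range(MAX_LEN + 1):
            for s in map(''.join, product('abc', repeat=n)):
                assert (s in closure) == P(s), s
        print("P(s) matches membership in {a,ab}* up to length", MAX_LEN)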

Tuesday 6 November 2012

Time complexity & Master Theorem & Repeated substitution

Time complexity and repeated substitution are the materials I found challenging in this course. This material was first introduced to me last year in CSC165 as big-O and big-Omega. Below are some example questions that I found quite challenging, so I'll go over them in detail.

Consider the following recurrence relation.  


        T(n) = { 2                        if n = 0, 1
               { T(floor(2n/3)) + lg(n)   if n > 1

    Find upper and lower bounds on the value of T(n), using the Master
    Theorem.  Then try to find a tight bound.

 * This is a trick question, because the Master Theorem only applies to
    recurrences where the extra work performed outside the recursive calls
    takes time \Theta(n^d) for some constant d.  

    Nevertheless, we can use the Master Theorem to obtain separate upper
    and lower bounds on the value of T(n).  Consider the following
    recurrence relations.

               
        L(n) = { 2                        if n = 0, 1
               { L(floor(2n/3)) + 1       if n > 1

        U(n) = { 2                        if n = 0, 1
               { U(floor(2n/3)) + n       if n > 1

    Since 1 <= lg(n) <= n for all n > 1, it is easy to see that for all n,
    L(n) <= T(n) <= U(n).
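
    Just to see the sandwich concretely, here is a small Python sketch of
    my own (not part of the solution) that evaluates T, L and U straight
    from their recurrences:

        import math
        from functools import lru_cache

        @lru_cache(maxsize=None)
        def T(n):
            return 2 if n <= 1 else T((2 * n) // 3) + math.log2(n)

        @lru_cache(maxsize=None)
        def L(n):
            return 2 if n <= 1 else L((2 * n) // 3) + 1

        @lru_cache(maxsize=None)
        def U(n):
            return 2 if n <= 1 else U((2 * n) // 3) + n

        for n in [4, 16, 256, 4096, 65536]:
            # expect L(n) <= T(n) <= U(n)
            print(n, L(n), round(T(n), 2), U(n))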

    Applying the Master Theorem to L(n), we get a = 1, b = 3/2 and d = 0 so
    a = b^d and L(n) (- \Theta(n^d log n) = \Theta(log n).
    Hence, T(n) (- \Omega(log n).
    In this case, it would be trivial to prove T(n) >= lg(n) by induction on
    n >= 1-- it may seem that the inductive hypothesis is not necessary, but
    it is to ensure that T(floor(2n/3)) is not negative.
    { Go over this proof at the end if you have time left-- but don't worry
      about it if you don't. }

    Applying the Master Theorem to U(n), we get a = 1, b = 3/2 and d = 1 so
    a < b^d and U(n) (- \Theta(n^d) = \Theta(n).
    Hence, T(n) (- O(n).
    Again, it is trivial to prove T(n) <= 2 n by induction on n >= 1--
    making use of the fact that lg(n) <= n for all n >= 1.
    { Go over this proof at the end if you have time left -- but don't worry
      about it if you don't. }

    Unfortunately, we have reached the limit of what can be learned by
    applying the Master Theorem-- there is no way to make the bounds any
    tighter.
    { Well, that's not quite true: if we remember that lg(n) is o(n^c) for
      any real number c > 0 (from calculus), then we can prove that T(n) is
      O(n^c) for any real number c > 0.  But that still leaves a gap between
      the upper and lower bounds. }

    In order to get a tight bound, we need to use something like the
    repeated substitution method.  We will need to use the following facts
    about logarithms-- it is assumed that you are aware of these properties,
    and you are allowed to make use of them in your own work.  For all real
    numbers x, y > 0, and all bases b > 1:
      - log_b(x/y) = log_b(x) - log_b(y)
      - log_b(xy) = log_b(x) + log_b(y)
      - log_b(x^y) = y log_b(x)

    Repeated substitution for T(n):

        T(n) = T(2n/3) + lg(n)
             = T(2(2n/3)/3) + lg(2n/3) + lg(n)
             = T((2/3)^2 n) + 2 lg(n) + lg(2/3)
             = T((2/3)^3 n) + lg((2/3)^2 n) + 2 lg(n) + lg(2/3)
             = T((2/3)^3 n) + 3 lg(n) + 3 lg(2/3)
             = T((2/3)^4 n) + lg((2/3)^3 n) + 3 lg(n) + 3 lg(2/3)
             = T((2/3)^4 n) + 4 lg(n) + 6 lg(2/3)

        after i substitutions,

             = T((2/3)^i n) + i lg(n) + (i (i-1) / 2) lg(2/3)

        T( ) "disappears" from the right-hand side when (2/3)^i n = 1, i.e.,
        i = log_{2/3}(1/n) = log_{3/2} n.  For this value of i,

        T(n) = T(1) + log_{3/2} n lg(n)
                 + (log_{3/2}(n) (log_{3/2}(n) - 1) / 2) lg(2/3)

    Based on this, we expect that T(n) (- \Theta(log^2 n).  Even though
    lg(2/3) is negative, the two quadratic terms do not cancel: writing
    log_{3/2}(n) = lg(n)/lg(3/2), the whole expression simplifies to
    (lg n)^2 / (2 lg(3/2)) + lg(n)/2 + T(1), which is indeed \Theta(log^2 n).
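
    As a quick numerical sanity check (again my own sketch, reusing the
    T(n) function from the earlier snippet), the ratio T(n) / (lg n)^2
    should tend to a constant if T(n) really is \Theta(log^2 n):

        for k in [8, 12, 16, 20, 24]:
            n = 2 ** k
            # the ratio drifts slowly toward roughly 1 / (2 lg(3/2)),
            # because of the lower-order lg(n)/2 and T(1) terms
            print(n, round(T(n) / math.log2(n) ** 2, 4))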

Sunday 14 October 2012

Term Test 1 Question 3

Hello fellows, today I'm going to tackle the question regarding structural induction. It's something I couldn't quite put my finger on at first glance, and I found it hard on term test 1, so I'll take a few steps back and look at structural induction in general.

So here's an explanation I found in the course notes:
Given a recursive definition of a set, we naturally use structural induction to prove properties of its elements.
Suppose X is a set defined using induction and we want to prove that every element of X has a certain property P. A proof by structural induction proceeds in two steps.

Basis: We prove that every "smallest" or "simplest" element of X satisfies property P.

Induction Step: We prove that every larger element of X, built up from smaller elements, satisfies property P, under the assumption that those smaller elements satisfy P.

I didn't get the third question of term test 1 right, so I'll take a few steps to look into it.

Here's the question:
Define epsilon as the smallest set such that
1. Symbols x,y,z are elements of epsilon.
2. If expressions e1 and e2 are elements of epsilon, then so is (e1 + e2).
For e in epsilon, define vr(e) as the number of occurrences of symbols from the set {x,y,z} in e, and p(e) as the number of occurrences of parentheses from the set {(,)} in e. For example, vr(((x+y)+z)) = 3 and p(((x+y)+z)) = 4. Use Structural Induction to prove that for every e in epsilon, 2vr(e) = p(e) + 2.

So, as with all induction proofs, we start by defining the predicate: P(e): "2vr(e) = p(e) + 2"

The first step is the basis.

Basis: e in {x,y,z}: Each of these expressions has exactly one variable and 0 parentheses, so 2vr(e) = 2 * 1 = 2 = 0 + 2 = p(e) + 2, so P(e) holds for every e in the basis.

Second, we show the induction step.

Induction Step: Assume e1 and e2 are in epsilon and that P(e1) and P(e2) hold.  [P(e1) and P(e2) have to hold in order to prove P((e1 + e2)).]

So 2vr((e1 + e2)) =  2(vr(e1) + vr(e2))     # every occurrence of x, y, z in (e1 + e2) comes from e1 or e2
                  =  2vr(e1) + 2vr(e2)
                  =  p(e1) + 2 + p(e2) + 2  # by the induction hypotheses P(e1) and P(e2)
                  =  (p(e1) + p(e2) + 2) + 2
                  =  p((e1 + e2)) + 2       # forming (e1 + e2) adds exactly two new parentheses

So, for all e1 and e2 in epsilon, P(e1) and P(e2) imply P((e1 + e2)). Therefore, by structural induction, P(e) holds for every e in epsilon.
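
As a sanity check on the claim (just my own Python sketch, not part of the test solution), we can build random expressions using exactly the two rules that define epsilon and verify 2vr(e) = p(e) + 2 for each one:

    import random

    def random_expr(depth):
        """Build a random expression from the recursive definition of epsilon."""
        if depth == 0 or random.random() < 0.3:
            return random.choice('xyz')             # rule 1: a single symbol
        left = random_expr(depth - 1)
        right = random_expr(depth - 1)
        return '(' + left + ' + ' + right + ')'     # rule 2: (e1 + e2)

    def vr(e):
        return sum(e.count(c) for c in 'xyz')

    def p(e):
        return e.count('(') + e.count(')')

    for _ in range(1000):
        e = random_expr(depth=5)
        assert 2 * vr(e) == p(e) + 2, e
    print("2vr(e) = p(e) + 2 held for every expression generated")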







Saturday 13 October 2012

Term Test 1 Question 1


Hello, today I'm going to go over question 1, which baffled me on the first midterm:

1) Use Mathematical Induction to prove that for every natural number n, 13^n - 1 is a multiple of 12.

So this proof uses one of the three kinds of induction we've learned so far. As always, we define our predicate P(n) first.

P(n): there exists an integer z such that 13^n - 1 = 12z

Then we set out the base case.

Base Case:      If n = 0, then 13^n - 1 = 13^0 - 1 = 0 = 12 * 0, so z = 0 works.
                Hence the base case satisfies the predicate, so P(0) is true.

Then here comes to our induction step.

Induction Step: Assume P(n); we want to prove P(n+1).

                13^(n+1) - 1  =  13(13^n) - 1
                              =  13((13^n) - 1) + 12    (rewrite -1 as -13 + 12 and factor the 13)
                              =  13(12z) + 12           (by the induction hypothesis P(n): 13^n - 1 = 12z)
                              =  12(13z + 1)            (factor out 12)
                              =  12 z'                  (where z' = 13z + 1 is an integer)

Since n was an arbitrary natural number and we derived P(n+1) from P(n), we can conclude by induction that P(n) holds for all natural numbers n.
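
And just to convince myself numerically (a quick Python check of my own, not part of the proof), the claim is easy to test for small n:

    for n in range(20):
        assert (13 ** n - 1) % 12 == 0, n
    print("13^n - 1 is a multiple of 12 for n = 0, ..., 19")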

Thursday 20 September 2012

Welcome!

Hello TAs and fellow students,
Welcome to my first SLOG (or second one; the first was mistakenly created on the other site). I hope everyone's doing OK after the first midterm. I personally found it challenging yet rewarding if you put effort into studying. I've enjoyed this course so far and hopefully will for the rest of the term. Stay tuned, peace out!