# Exam: 11 Feb 09

Posted by bobthebuilder
Exam: 11 Feb 09 - January 22, 2009 11:00PM
Is anyone else studying more than just chapters 18-21? I have summarised pieces of chapters 1, 7, 8, 9, 13, 14, 16 and 17, as well as chapters 18-20 (hoping to wrap up chapter 21 this week). I don't think I would ever have gotten my head around some parts of chapters 18-20 without having done this.

I am planning to work through every exercise in chapters 18-21 before the exam as well - is anyone else planning to do this, and do you want to share answers? I have really struggled with 19.4; I just can't seem to understand the semantics here in terms of the trees in figures such as Fig. 19.13. Anyone else have this problem? And do you think p. 703 "Inductive learning with inverse deduction" is important, considering the tutorials just concentrated on the FOIL algorithm, aka p. 701 "Top-down inductive learning methods"?
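For what it's worth, the part of FOIL the tutorials covered boils down to repeatedly adding the literal with the best information gain to a rule. A minimal sketch of that gain calculation (my own illustration, not lifted from the textbook; the variable names are mine):

```python
import math

def foil_gain(t, p0, n0, p1, n1):
    """FOIL's information gain for extending a rule with a candidate literal.

    t      -- positive bindings covered both before and after the extension
    p0, n0 -- positive/negative bindings covered by the rule before
    p1, n1 -- positive/negative bindings covered after adding the literal
    """
    before = math.log2(p0 / (p0 + n0))  # information content of the old rule
    after = math.log2(p1 / (p1 + n1))   # information content of the new rule
    return t * (after - before)

# A literal that keeps all 4 positive bindings but excludes all 4 negatives:
print(foil_gain(t=4, p0=4, n0=4, p1=4, n1=0))  # -> 4.0
```

So a literal scores well when it keeps many positive bindings while raising the proportion of positives among everything still covered.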

Another problem I am having is with the jargon - every single concept, algorithm or learning method seems to need a name at least 5-10 words long, and I just can't keep track of all the concepts I have read. All the names are merging into one that sounds like "Intuitive inductive explanation-based learning attribute decision net tree gooblegook googlegook something something". Really struggling to differentiate between the different concepts, I'm afraid.

I haven't done any maths since 2001 and my formal logic course was 2 years ago - all the differentiation and logarithms are scaring me silly - I can't seem to remember the detail any more. It used to come so easily, now I'm afraid I feel proud of myself if I can just remember what a logarithm actually is!

I am just hoping I pass - this is my 10th and last exam and then I'll be done, with luck!!
Re: Exam: 11 Feb 09 - January 26, 2009 12:07AM
Anyone else started studying for this yet? Feels like I am shouting into the void here
Re: Exam: 11 Feb 09 - January 26, 2009 12:41PM
I am trying to start, but I have too much other stuff getting in the way. Will probably start in earnest tonight. If you want to be generous and share those summaries, please send them this way.

--
"Knowledge has much better uses than self-pity and superiority"
Re: Exam: 11 Feb 09 - January 29, 2009 05:08PM
I have started, but am down to studying only chapters 18 and 19. The stats stuff is a complete mystery to me.

I did formal logic today, so at least that bit I know, but the maths is the same story. Also not sure what to do on the swarm bit - I know what I think is the overview of it, but it would have been nice to actually get a model answer of what we need to know.
Re: Exam: 11 Feb 09 - January 29, 2009 10:21PM
The comment on my swarm submission was that I must cover the algorithms in more detail, so I suspect the algorithms might feature in the exam.

--
"Knowledge has much better uses than self-pity and superiority"
Re: Exam: 11 Feb 09 - February 02, 2009 06:17PM
I got 70% for my swarm intelligence tutorial - would be happy to swap answers with anyone else who got 60%+ for this tut. Must say, I only did a 2.5 page summary of all the main algorithms / history etc. and didn't get into any technical detail. I assume, though, that we are going to get a 20pt question saying "Discuss the principles and most important algorithms of Swarm Intelligence and give examples of applications." I am going to try to flesh out my tutorial answer to match this.

I don't get the impression the stats stuff is that important, but I have made sure I can do the Bayesian calculations if required in the exam. I have sped past most of chapter 20 except 20.1, the neural network stuff and the support vector machines. The other stuff just seems to harp on the same theme, but in more mathematical complexity than I think we are expected to get into.
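The Bayesian calculations in question are presumably along these lines - Bayes' rule for a boolean hypothesis, sketched on a made-up two-hypothesis example (the numbers are mine, purely for illustration):

```python
def posterior(prior, likelihood, likelihood_given_not):
    """P(H | e) via Bayes' rule, for a boolean hypothesis H and evidence e."""
    # Normalising constant P(e), by conditioning on H and not-H:
    evidence = likelihood * prior + likelihood_given_not * (1 - prior)
    return likelihood * prior / evidence

# P(H) = 0.01, P(e | H) = 0.9, P(e | not H) = 0.05
p = posterior(0.01, 0.9, 0.05)
print(round(p, 4))  # -> 0.1538
```

The only real work is computing the normalising constant in the denominator; the rest is plugging in the priors and likelihoods from the question.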

I am completely not loving chapter 21 - the sample answers are exactly that - there's nothing to connect them with the heavy theory that makes up most of this chapter and I am left completely in the dark as to where these answers came from.
Re: Exam: 11 Feb 09 - February 03, 2009 01:51PM
Does anyone know what the format of the paper looks like? The uncertainty is incredible.
The exam guidelines list Unification as a topic equal to Swarm Intelligence, but it only gets about 3 pages of textbook coverage.

Are the assignment questions a good indication of exam questions? (this seems unlikely considering how long most questions took to answer).

Is anyone else just about in a flat panic?
Re: Exam: 11 Feb 09 - February 03, 2009 06:14PM
Not really in a flat panic, just basically given up on passing this one. Like you said, the uncertainty - it would have been nice to get an old exam paper, or at the very least an example paper. Guess we will find out on the day...
Re: Exam: 11 Feb 09 - February 03, 2009 07:53PM
Hmm... tutorial letter 101 doesn't contain a lecturer email address, and osprey doesn't list one. I tried sending an email through myUnisa; let's hope a reply is reasonably timeous.
Otherwise, I'm kinda not sure if this is going to be one of those papers that is way easier than imagined (unlikely) or completely out of left field.

good luck..
Re: Exam: 11 Feb 09 - February 04, 2009 12:32PM
This is a new subject (or at least the first time it has been offered in a while), so seeing a past paper isn't even an option, sadly.
Previous experience with new UNISA subjects tells me that this paper will probably be horrendously hard, although I am hoping that this one will be different.

The fact that the lecturer is elusive definitely doesn't make me feel any more at ease.

--
"Knowledge has much better uses than self-pity and superiority"
Re: Exam: 11 Feb 09 - February 06, 2009 06:10PM
I have been studying this subject flat out for about a month and I keep approaching flat panic. I am just hoping for 50%. I have no idea how the paper is going to be presented, and I know if it ends up being heavily based on chapter 21, I will probably have a hernia and keel over in the exam hall. Chapter 21 is NOT my favourite chapter, nor is chapter 19. I am starting to feel a little bit more secure on the decision tree, neural network and Bayesian front though, and I'm just going to memorise my paper on Swarm Intelligence.

I am basically just preparing myself for a horror of a paper, that way I won't hyperventilate when I see it.

Good luck to all of you in any case
Re: Exam: 11 Feb 09 - February 09, 2009 06:56PM
Following on from my question posted on 22 Jan - has anyone figured out how to do exercise 19.4 yet?

In the "Inductive Learning with Inverse Deduction" section, it seems to imply that C1 /\ C2 --> C, yet exercise 19.4 has C = true implying C1 and other bits and pieces of clauses.

I just don't get it. Is this syntax used anywhere else in the book, does anyone know?
Re: Exam: 11 Feb 09 - May 16, 2009 11:10AM
Hi Bob,

I realize you've probably aced this subject already, but I was wondering if you ever figured out 19.4. I'm having similar troubles (yes, it's part of the assignment this year too). To jog your memory a little, part (a) reads as follows:

The resolvent C: True -> P(A,B)
Clause 1 C1: P(x,y) -> Q(x,y)
Clause 2 C2: ???

Now for resolution to apply, I realise that these sentences need to be in CNF, so I converted them to the following:

C: Not(True) V P(A,B), which is equivalent to False V P(A,B), which is equivalent to P(A,B)
C1: Not(P(x,y)) V Q(x,y), which I'll write as !P(x,y) V Q(x,y)

But for C1 and C2 to resolve to C, which has only a single literal, they must have exactly three literals between them, and C1 already has two. Since Q is not in C, !Q(x,y) (or anything that unifies with it) must be C2. However, that leaves us with !P(x,y) or, at best, !P(A,B) as the resolvent - not P(A,B).
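The literal-count argument above can be checked mechanically. Here's a toy propositional resolver (my own sketch - it ignores unification entirely, so it only illustrates the counting, not the first-order case):

```python
def negate(lit):
    """Complement a literal written as "P" or "!P"."""
    return lit[1:] if lit.startswith("!") else "!" + lit

def resolvents(c1, c2):
    """All clauses obtainable by resolving c1 with c2 on one literal pair."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            # Resolvent keeps everything except the complementary pair,
            # so it has len(c1) + len(c2) - 2 literals.
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

c1 = frozenset({"!P", "Q"})  # P -> Q in clausal form
c2 = frozenset({"!Q"})       # the candidate second clause
print(resolvents(c1, c2))    # the single resolvent is {'!P'}, not {'P'}
```

Running it confirms the arithmetic: a two-literal clause and a one-literal clause resolve to a single literal, and with these inputs that literal is !P rather than the P we need in C.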

What am I doing wrong? Have I converted clausal form incorrectly? Or have I misunderstood the resolution step? Or do I just not fully grok Inverse Resolution yet?