Q1. Conditional Independence

Consider the following Bayes net (edges $a \to c$, $b \to c$, $c \to d$, $c \to e$; see the code cell below).

1.1. Show that $a$ is independent of $b$ given no other information, i.e. $a \perp b$.

1.2. Prove or disprove the following using basic probability (i.e. not using d-separation): $a \perp b \mid e$.

Q2. Conditional Independence and Causality

Consider the following model: $a \to b$, $a \to c$. Show that the causal relationship suggested by the arrows does not necessarily hold, because the identical distribution can be represented by a model defined by different conditional distributions. What conditional independence assumption does this model make?

Q3. Model Complexity, Free Parameters, and Simplifying Assumptions

3.1. Consider a general joint probability distribution with $N$ variables $x_1, \ldots, x_N$, each of which can have $K$ values. What is the expression for the joint distribution in terms of conditional probabilities?

3.2. What is the total number of free parameters required to specify this model? (Note: the term free parameter means a parameter that is unconstrained. For example, a Bernoulli distribution describing a coin flip has one free parameter, say the probability of heads; the probability of tails must be one minus that, because the probabilities are constrained to sum to one.) Provide both the exact expression and a simpler one in big-O notation.

3.3. Now suppose that the complexity of the model is constrained, so that each variable depends on (at most) $m$ other variables and is conditionally independent of the rest, i.e. a Bayes net. Each node has $m$ parents and there are $m$ root nodes. How many parameters are required to define this model?

3.4. Let us make one more simplifying assumption: in addition to depending on only $m$ variables, the conditional probability is described by a noisy-OR function ($K = 2$; see Q4). What is the expression for the number of parameters in this case?
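The independence claims in Q1 can be sanity-checked numerically (this is a check, not a proof). The sketch below, in Python for portability, enumerates the joint for the Q1 graph $a \to c \leftarrow b$, $c \to d$, $c \to e$ using made-up CPT values (the assignment specifies none) and compares $p(a,b)$ with $p(a)\,p(b)$, both unconditionally and conditioned on $e$.

```python
from itertools import product

# Hypothetical CPTs for the Q1 graph (a -> c <- b, c -> d, c -> e).
# All variables are binary; the numbers are illustrative only.
p_a = {0: 0.7, 1: 0.3}
p_b = {0: 0.4, 1: 0.6}
p_c = {(0, 0): 0.1, (0, 1): 0.8, (1, 0): 0.8, (1, 1): 0.95}  # p(c=1 | a, b)
p_d = {0: 0.2, 1: 0.7}   # p(d=1 | c)
p_e = {0: 0.1, 1: 0.9}   # p(e=1 | c)

def joint(a, b, c, d, e):
    """p(a, b, c, d, e) from the factored form of the Bayes net."""
    pc = p_c[(a, b)] if c == 1 else 1 - p_c[(a, b)]
    pd = p_d[c] if d == 1 else 1 - p_d[c]
    pe = p_e[c] if e == 1 else 1 - p_e[c]
    return p_a[a] * p_b[b] * pc * pd * pe

def prob(fixed):
    """Marginal probability of the assignments in `fixed` (a dict)."""
    total = 0.0
    for a, b, c, d, e in product((0, 1), repeat=5):
        assign = dict(a=a, b=b, c=c, d=d, e=e)
        if all(assign[k] == v for k, v in fixed.items()):
            total += joint(a, b, c, d, e)
    return total

# 1.1: a and b are marginally independent: p(a, b) = p(a) p(b).
gap_marginal = abs(prob({'a': 1, 'b': 1}) - prob({'a': 1}) * prob({'b': 1}))

# 1.2: conditioning on e (a descendant of the collider c) couples a and b.
pe1 = prob({'e': 1})
gap_given_e = abs(prob({'a': 1, 'b': 1, 'e': 1}) / pe1
                  - (prob({'a': 1, 'e': 1}) / pe1) * (prob({'b': 1, 'e': 1}) / pe1))
```

With these CPTs the marginal gap is zero to machine precision, while the gap given $e$ is clearly nonzero, illustrating explaining-away through the collider $c$.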
Q4. Models of Conditional Probability

In Bayesian networks (or directed acyclic graphical models), the joint probability distribution is factored into the product of conditional probability distributions

$$p(\mathbf{x}) = \prod_{i=1}^{N} p(x_i \mid \mathrm{pa}(x_i))$$

As we used in the previous problem, a simplifying assumption for the conditional probability is the noisy-OR model

$$p(x_i = 1 \mid \mathrm{pa}(x_i)) = 1 - (1 - \mu_{i0}) \prod_{j \in \mathrm{pa}(x_i)} (1 - \mu_{ij})^{x_j}$$

where $j$ is an index over the parents of $x_i$. Note that the exponent $x_j$ is either 0 or 1, so each term in the product is either 1 or $1 - \mu_{ij}$, depending on the state of the parent $x_j$.

4.1. Show that the noisy-OR function can be interpreted as a soft (i.e. probabilistic) form of the logical OR function, i.e. the function gives $x_i = 1$ whenever at least one of the parents is 1.

4.2. What is the interpretation of $\mu_{i0}$? Provide a clear explanation.

Another choice for the conditional probability is a sigmoid function

$$p(x_i = 1 \mid \mathrm{pa}(x_i)) = \sigma\Big(w_{i0} + \sum_{j \in \mathrm{pa}(x_i)} w_{ij} x_j\Big), \quad \text{where} \quad \sigma(a) = \frac{1}{1 + e^{-a}}$$

is the logistic sigmoid function.

4.3. Contrast the noisy-OR function and the sigmoid mathematically. Is one more general than the other? Can each compute unique functions?

4.4. Think of two examples, one for the noisy-OR and one for the sigmoid, that contrast the way these functions model the conditional dependencies. Explain how each is appropriately modeled by one function but not the other.

Q5. Car Troubles

(Adapted from Barber, Exercise 3.6.) Your friend has car trouble. The probability of the car starting is described by the model below, with the probabilities given in Barber 3.6.

5.1 (10 pts) Calculate $p(f = \text{empty} \mid s = \text{no})$, the probability of the fuel tank being empty given that the car does not start. Do this by hand, i.e. in a manner similar to the Inference section in Barber 3.1.1. Use the probabilities given in the exercise. Show your work.

5.2 (5 pts) Implement this network using a toolbox for probabilistic models (e.g. pgmpy or BayesNets.jl). Use this to verify that your derivation and calculations are correct for the previous problem.

5.3 (10 pts) Suppose you have loaned this car to a friend.
They call you and announce that the car won't start. Illustrate your diagnostic and inference process by using the model to show how your beliefs change as you ask questions. Your friend can only tell you the states of $t$ (turns over) and $g$ (gauge), and you already know $s = \text{no}$. Use two different scenarios, i.e. two different reasons why the car won't start. For each scenario, your answer should discuss your choice of each question you pose to the network, and how it allows you to uncover the true cause of the problem.

Exploration (20 pts)

Like in the first assignment, in this exercise you have more latitude and are meant to do creative exploration. Like before, you don't need to write a book chapter, but the intention is for you to go beyond what has been covered above.

Implement a belief network of your own choosing or design. It should be more complex than the examples above. It should be discrete (we will cover continuous models later). Use the model to illustrate deductive inference problems.

Exploration Grading Rubric

Exploration problems will be graded according to the elements in the table below. The scores in the column headers indicate the number of points possible for each rubric element (given in the rows). A score of zero for an element is possible if it is missing entirely.

| Element | Substandard (+1) | Basic (+2) | Good (+3) | Excellent (+5) |
|---|---|---|---|---|
| Pedagogical Value | No clear statement of idea or concept being explored or explained; lack of motivating questions. | Simple problem with adequate motivation; still could be a useful addition to an assignment. | Good choice of problem with effective illustrations of concept(s). | Demonstrates a deeper level of understanding. Problem also illustrates or clarifies common conceptual difficulties or misconceptions. |
| Novelty of Ideas | Copies existing problem or makes only a trivial modification; lack of citation(s) for source of inspiration. | Concepts are similar to those covered in the assignment but with some modifications of an existing exercise. | Ideas have clear pedagogical motivation; creates a different type of problem or exercise to explore related or foundational concepts more deeply. | Applies a technique or explores a concept not covered in the assignment or not discussed at length in lecture. |
| Clarity of Explanation | Little or confusing explanation; figures lack labels or useful captions; no explanation of motivations. | Explanations are present, but unclear, unfocused, wordy, or contain too much technical detail. | Clear and concise explanations of key ideas and motivations. | Also clear and concise, but includes illustrative figures; could be read and understood by students from a variety of backgrounds. |
| Depth of Exploration | Content is obvious or closely imitates assignment problems. | Uses an existing problem for different data. | Applies a variation of a technique to solve a problem with an interesting motivation; explores a concept in a series of related problems. | Applies several concepts or techniques; has a clear focus of inquiry that is approached from multiple directions. |

```julia
# In [3]: activate the 491 environment for package version consistency
using Pkg
Pkg.activate(".")
```

```julia
# In [4]: Bayes net for Q1
using TikzGraphs, LightGraphs
g = DiGraph(5)
add_edge!(g, 1, 3); add_edge!(g, 2, 3); add_edge!(g, 3, 4); add_edge!(g, 3, 5)
TikzGraphs.plot(g, ["a", "b", "c", "d", "e"], options="font=\\large")
```

```julia
# In [5]: model for Q2
g = DiGraph(3)
add_edge!(g, 1, 2); add_edge!(g, 1, 3)
TikzGraphs.plot(g, ["a", "b", "c"], options="font=\\large")
```

```julia
# In [8]: car-trouble network for Q5
using TikzPictures, LaTeXStrings
# draw the nodes, then make the links
g = TikzPicture(L"""
\tikzstyle{every node} = [draw, minimum size=7mm, rounded corners=2mm]
\tikzset{>=latex}
\foreach \n/\x/\y/\label in {b/1/3/battery, f/3/3/fuel, g/2/2/gauge, t/1/1/turns over, s/3/1/starts}
    \node (\n) at (\x,\y) {\label};
\foreach \from/\to in {b/t, b/g, f/g, f/s, t/s}
    \draw [->] (\from) -- (\to);
""", options="scale=1.25, thick, transform shape")
```
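For 5.2, any toolbox works, but the network is small enough to query by exhaustive enumeration, which is a useful cross-check on a toolbox result. The Python sketch below (standard library only) computes $p(f=\text{empty} \mid s=\text{no})$ for the Q5 graph ($b \to t$, $b \to g$, $f \to g$, $f \to s$, $t \to s$). The CPT numbers are placeholders in the spirit of Barber 3.6; substitute the values from the exercise before trusting the output.

```python
from itertools import product

# State convention: 1 = "faulty" (battery bad, tank empty, does not turn over,
# does not start, gauge reads empty). CPT values are placeholders -- replace
# with the probabilities given in Barber, Exercise 3.6.
p_b_bad = 0.02                      # p(battery = bad)
p_f_empty = 0.05                    # p(fuel = empty)
p_g_empty = {(0, 0): 0.04, (0, 1): 0.97,
             (1, 0): 0.10, (1, 1): 0.99}   # p(g = empty | b, f)
p_t_no = {0: 0.03, 1: 0.98}                # p(turns over = no | b)
p_s_no = {(0, 0): 0.01, (0, 1): 0.92,
          (1, 0): 1.0, (1, 1): 1.0}        # p(s = no | t, f)

def joint(b, f, g, t, s):
    """p(b, f, g, t, s) = p(b) p(f) p(g|b,f) p(t|b) p(s|t,f)."""
    pb = p_b_bad if b else 1 - p_b_bad
    pf = p_f_empty if f else 1 - p_f_empty
    pg = p_g_empty[(b, f)] if g else 1 - p_g_empty[(b, f)]
    pt = p_t_no[b] if t else 1 - p_t_no[b]
    ps = p_s_no[(t, f)] if s else 1 - p_s_no[(t, f)]
    return pb * pf * pg * pt * ps

# p(f = empty | s = no): sum out b, g, t in numerator and b, f, g, t in denominator.
num = sum(joint(b, 1, g, t, 1) for b, g, t in product((0, 1), repeat=3))
den = sum(joint(b, f, g, t, 1) for b, f, g, t in product((0, 1), repeat=4))
posterior = num / den
```

With these placeholder numbers the posterior comes out near 0.45, a large jump from the 0.05 prior on an empty tank, which is the kind of belief update your hand derivation in 5.1 should reproduce with the book's values.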
EECS491 Assignment 2: Conditional Independence