Collection of Missed Questions

1. You are training a classification model with logistic regression. Which of the following statements are true? Check all that apply.【D】
A. Introducing regularization to the model always results in equal or better performance on the training set.
【Explanation】If the regularization parameter λ is too large, the model underfits, so the fit on the training set can actually get worse.
B. Adding many new features to the model helps prevent overfitting on the training set.
【Explanation】Adding many new features lets the model fit the training data more closely, but too many features can cause overfitting, so the model fails to generalize to unseen data and its predictions become less accurate.
C. Adding a new feature to the model always results in equal or better performance on examples not in the training set.
【Explanation】Adding a new feature can lead to overfitting, which can make predictions on examples outside the training set worse even as the training-set fit improves.
D. Adding a new feature to the model always results in equal or better performance on the training set.
【Explanation】By adding a new feature, the model becomes more (or just as) expressive, allowing it to learn more complex hypotheses that fit the training set at least as well.
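Both effects in the explanations above can be checked numerically. Below is a minimal sketch (a regression analogue, assuming scikit-learn is available; the dataset, degrees, and λ values are all illustrative): adding polynomial features never increases the training error, while a large regularization parameter can.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)

for degree in (1, 5, 10):            # more features: training error never increases
    for lam in (0.0, 1.0, 100.0):    # larger lambda: training error can increase
        Phi = PolynomialFeatures(degree).fit_transform(X)
        model = Ridge(alpha=lam).fit(Phi, y)   # L2-regularized linear regression
        err = mean_squared_error(y, model.predict(Phi))
        print(f"degree={degree:2d}  lambda={lam:6.1f}  train MSE={err:.4f}")
```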


2. Which of the following statements are true? Check all that apply.【BD】
A. Suppose you have a multi-class classification problem with three classes, trained with a 3 layer network. Let $a^{(3)}_1 = (h_\Theta(x))_1$ be the activation of the first output unit, and similarly $a^{(3)}_2 = (h_\Theta(x))_2$ and $a^{(3)}_3 = (h_\Theta(x))_3$. Then for any input x, it must be the case that $a^{(3)}_1 + a^{(3)}_2 + a^{(3)}_3 = 1$.

B. In a neural network with many layers, we think of each successive layer as being able to use the earlier layers as features, so as to be able to compute increasingly complex functions.

C. If a neural network is overfitting the data, one solution would be to decrease the regularization parameter λ.

D. If a neural network is overfitting the data, one solution would be to increase the regularization parameter λ.
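Why A is false: with sigmoid output units (as in the course's networks), the three outputs are computed independently, so nothing forces them to sum to 1 (that would require a softmax output layer). A minimal numpy sketch, with randomly chosen illustrative weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
Theta1 = rng.standard_normal((4, 3))   # hidden layer: 4 units, 2 inputs + bias
Theta2 = rng.standard_normal((3, 5))   # output layer: 3 classes, 4 hidden + bias

x = np.array([0.5, -1.2])
a1 = np.concatenate(([1.0], x))                     # input with bias unit
a2 = np.concatenate(([1.0], sigmoid(Theta1 @ a1)))  # hidden activations with bias
a3 = sigmoid(Theta2 @ a2)                           # three independent sigmoid outputs

print(a3, "sum =", a3.sum())   # the sum is generally not 1
```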


3. You are using the neural network pictured below and have learned the parameters $\Theta^{(1)} = \begin{bmatrix} 1 & -1.5 & 3.7 \\ 1 & 5.1 & 2.3 \end{bmatrix}$ (used to compute $a^{(2)}$) and $\Theta^{(2)} = \begin{bmatrix} 1 & 0.6 & -0.8 \end{bmatrix}$ (used to compute $a^{(3)}$ as a function of $a^{(2)}$). Suppose you swap the parameters for the first hidden layer between its two units, so $\Theta^{(1)} = \begin{bmatrix} 1 & 5.1 & 2.3 \\ 1 & -1.5 & 3.7 \end{bmatrix}$, and also swap the output layer, so $\Theta^{(2)} = \begin{bmatrix} 1 & -0.8 & 0.6 \end{bmatrix}$. How will this change the value of the output $h_\Theta(x)$?【A】

A. It will stay the same.
B. It will increase.
C. It will decrease.
D. Insufficient information to tell: it may increase or decrease.
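Swapping the two hidden units merely relabels them, and swapping the corresponding output weights keeps each hidden activation paired with the same weight, so the weighted sum at the output, and hence $h_\Theta(x)$, is unchanged. This can be verified numerically with the matrices above (a sketch assuming sigmoid activations; the test input is arbitrary):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def h(Theta1, Theta2, x):
    a2 = sigmoid(Theta1 @ np.concatenate(([1.0], x)))     # hidden activations
    return sigmoid(Theta2 @ np.concatenate(([1.0], a2)))  # output h_Theta(x)

Theta1 = np.array([[1.0, -1.5, 3.7],
                   [1.0,  5.1, 2.3]])
Theta2 = np.array([1.0, 0.6, -0.8])

Theta1_swapped = Theta1[[1, 0], :]   # swap the two hidden units
Theta2_swapped = Theta2[[0, 2, 1]]   # swap the matching output weights

x = np.array([0.3, -0.7])
print(h(Theta1, Theta2, x), h(Theta1_swapped, Theta2_swapped, x))  # identical
```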


4. Which of the following statements are true? Check all that apply.【BD】
A. Suppose you are training a logistic regression classifier using polynomial features and want to select what degree polynomial (denoted d in the lecture videos) to use. After training the classifier on the entire training set, you decide to use a subset of the training examples as a validation set. This will work just as well as having a validation set that is separate (disjoint) from the training set.
B. Suppose you are using linear regression to predict housing prices, and your dataset comes sorted in order of increasing sizes of houses. It is then important to randomly shuffle the dataset before splitting it into training, validation and test sets, so that we don't have all the smallest houses going into the training set, and all the largest houses going into the test set.
C. It is okay to use data from the test set to choose the regularization parameter λ, but not the model parameters (θ).
D. A typical split of a dataset into training, validation and test sets might be 60% training set, 20% validation set, and 20% test set.
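Options B and D together describe the standard recipe: shuffle first, then split 60/20/20. A minimal sketch (the function name and toy dataset are illustrative):

```python
import numpy as np

def split_dataset(X, y, seed=0):
    """Shuffle, then split 60/20/20 into train/validation/test."""
    m = X.shape[0]
    idx = np.random.default_rng(seed).permutation(m)  # random shuffle
    X, y = X[idx], y[idx]
    i1, i2 = int(0.6 * m), int(0.8 * m)
    return (X[:i1], y[:i1]), (X[i1:i2], y[i1:i2]), (X[i2:], y[i2:])

# e.g. house sizes stored in sorted order, as in option B
X = np.arange(100, dtype=float).reshape(-1, 1)
y = 3.0 * X[:, 0] + 5.0
train, val, test = split_dataset(X, y)
print(train[0].shape, val[0].shape, test[0].shape)  # (60, 1) (20, 1) (20, 1)
```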


5. Suppose you have a dataset with n = 10 features and m = 5000 examples. After training your logistic regression classifier with gradient descent, you find that it has underfit the training set and does not achieve the desired performance on the training or cross validation sets. Which of the following might be promising steps to take? Check all that apply.【AC】
A. Use an SVM with a Gaussian Kernel.
【Explanation】An SVM with a Gaussian kernel can fit more complex decision boundaries, which helps correct underfitting to some extent.
B. Use a different optimization method since using gradient descent to train logistic regression might result in a local minimum.
【Explanation】The logistic regression cost function is convex, so gradient descent cannot get stuck in a local minimum.
C. Create / add new polynomial features.
D. Increase λ.
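A and C help for the same reason: both let the classifier represent a more complex decision boundary, whereas increasing λ (option D) would make underfitting worse. A minimal scikit-learn sketch on an illustrative dataset that no linear boundary can separate (hyperparameters are arbitrary):

```python
from sklearn.datasets import make_circles
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.svm import SVC

# two concentric rings: plain logistic regression must underfit
X, y = make_circles(n_samples=500, noise=0.1, factor=0.4, random_state=0)

linear_lr = LogisticRegression().fit(X, y)
rbf_svm = SVC(kernel="rbf").fit(X, y)                     # option A
Phi = PolynomialFeatures(degree=3).fit_transform(X)
poly_lr = LogisticRegression(max_iter=5000).fit(Phi, y)   # option C

print("linear LR training accuracy:", linear_lr.score(X, y))
print("RBF SVM training accuracy:  ", rbf_svm.score(X, y))
print("poly LR training accuracy:  ", poly_lr.score(Phi, y))
```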
