00:00
Sunk cost fallacy.
沉沒成本謬誤。
00:02
You paid for the meal.
你已經付了餐費。
00:03
It tastes like rubber and regret, but you eat it anyway because you don't want to waste it.
這頓飯吃起來像橡膠加後悔,但你還是把它吃完,因為不想浪費。
00:09
You're experiencing the sunk cost fallacy.
你正經歷沉沒成本謬誤。
00:12
Sticking to something just because you've already put time or money into it.
只因為已經投入時間或金錢,就堅持做某件事。
00:16
People hang on to bad jobs, finish books they hate, or keep fixing cars that should have
人們死守爛工作、硬看完討厭的書,或不斷修理早該報廢的車,
00:19
been scrapped just because they've already put time or money in.
只因為已經投入時間或金錢。
00:25
The Concorde jet is a classic case.
協和客機就是經典案例。
00:27
It was fast, cool, and a total money pit.
它飛得快、很酷,但完全是個燒錢坑。
00:28
Everyone knew it wasn't worth it, but they kept spending just to avoid admitting it flopped.
大家都知道它不划算,但為了避免承認失敗,他們還是繼續砸錢。
00:34
That's the sunk cost trap.
這就是沉沒成本陷阱。
00:35
When letting go feels worse than wasting more. But in reality, quitting at the right time isn't failure.
就是當放手的感覺比繼續浪費更糟。但實際上,適時放棄並不是失敗。
00:40
It's just smart.
反而是明智之舉。
00:42
Dunning-Kruger effect.
達克效應。
00:43
Someone watches a 5-minute tutorial and suddenly thinks they could teach the class.
有人看完五分鐘教學影片,就突然覺得自己能教這門課。
00:48
That person is experiencing the Dunning-Kruger effect.
這個人正經歷達克效應。
00:51
It's when people with low ability think they're experts, while real experts
能力不足的人會自認是專家,而真正的專家
00:53
underestimate themselves.
卻往往低估自己。
00:56
The catch is that you need some knowledge just to realize how much you don't know.
關鍵在於,你需要具備一定的知識,才能意識到自己有多無知。
01:01
In one study, the worst performers thought they aced a logic test.
在一項研究中,表現最差的人認為自己在邏輯測驗中表現優異。
01:04
The actual top scorers assumed they did average.
實際得分最高的人則認為自己表現平平。
01:06
Confidence doesn't follow skill.
自信並不隨著技能而來。
01:09
It follows ignorance.
它伴隨著無知而來。
01:09
And the less you know, the less you notice your own mistakes.
你知道的越少,就越難注意到自己的錯誤。
01:12
So, next time someone starts talking like they've solved climate change after reading a blog post, maybe
所以,下次有人讀了一篇部落格文章後,就開始高談闊論說他們解決了氣候變遷問題,也許
01:17
don't take notes.
不用太當真。
01:20
Framing effect.
框架效應 (Framing effect)。
01:22
Tell someone a surgery has a 90% success rate and they'll nod.
告訴某人一場手術的成功率是 90%,他們會點頭同意。
01:24
Tell them it has a 10% failure rate and suddenly they hesitate even though it's the same number.
告訴他們失敗率是 10%,儘管數字相同,他們突然就會猶豫不決。
01:30
That would be the framing effect.
這就是框架效應。
01:32
The way information is presented changes how you feel about it.
資訊呈現的方式會改變你對它的感受。
01:35
Not the facts, just the packaging.
改變的不是事實,而是包裝。
01:36
Say 9 out of 10 survived and it's a relief.
說 10 人中有 9 人存活,會令人鬆一口氣。
01:39
Say 1 in 10 died and it's a warning.
說 10 人中有 1 人死亡,則像是一種警告。
01:42
Marketers use it and politicians live by it.
行銷人員利用它,政治人物更是以此為生。
01:45
And your brain rarely stops to check the math.
而你的大腦很少會停下來檢查運算。
01:49
It reacts to tone, not truth.
它對語氣做出反應,而不是對真相。
01:49
So even when the data is identical, the story it tells can flip just by swapping a few
所以,即使數據完全相同,只要替換幾個
01:53
words.
字眼,它所傳達的故事就可能完全翻轉。
01:54
Confirmation bias.
確認偏誤。
01:56
Ever Google something hoping to prove a point and somehow every result agrees with you?
你是否曾為了證明某個觀點而上網搜尋,結果所有搜尋結果都恰好支持你的想法?
02:01
That's not research.
這不是研究。
02:03
That's confirmation bias.
這是確認偏誤。
02:03
The brain's habit of cherrypicking info that supports what you already believe.
大腦習慣挑選那些支持你既有信念的資訊。
02:07
It doesn't care if it's right.
它不在乎資訊是否正確。
02:09
It cares if it feels right.
它在乎的是感覺是否正確。
02:11
If you believe in something strongly, your brain filters everything else.
如果你對某件事深信不疑,大腦就會過濾掉其他一切。
02:16
Studies, memories, even search results get bent to fit the story you already like.
研究、記憶,甚至是搜尋結果,都會被扭曲以符合你本來就喜歡的故事。
02:19
That's how people become experts with zero opposing input.
這就是人們如何在毫無反對意見的情況下成為「專家」。
02:23
It's not curiosity, it's mental comfort food, and it tastes great until you realize you're living in
這不是好奇心,而是心靈的安慰劑,味道很好,直到你發現自己活在
02:28
a custom-built echo chamber.
一個量身打造的迴聲室裡。
02:31
Omission bias.
不作為偏誤。
02:32
Imagine this.
想像一下。
02:33
You see someone choking and you could save them by acting.
你看到有人窒息,只要採取行動就能救他。
02:36
You freeze.
你僵住了。
02:39
People might still say, "Well, you didn't do anything wrong." That would be omission bias.
人們可能還是會說:「嗯,你沒做錯什麼。」這就是不作為偏誤。
02:43
The idea that doing harm is worse than letting harm happen.
這種心態認為造成傷害比任由傷害發生更糟糕。
02:48
Same result, different guilt.
結果相同,但罪惡感不同。
02:50
This shows up in medicine, parenting, and leadership: everywhere big choices live.
這在醫學、育兒、領導等所有重大抉擇的領域中隨處可見。
02:56
Doing nothing feels safer.
按兵不動讓人感覺更安全。
02:56
But that's just the brain's way of staying clean while others pay the cost.
但這只是大腦在保持自身清白,而讓他人承擔代價的方式。
03:01
The scariest part is that the bias works even when logic says otherwise because emotionally
最可怕的是,即便邏輯上說不通,這種偏見依然有效,因為在情感上,
03:05
passivity feels innocent even when it isn't.
被動即使不該無辜,也會讓人感覺無辜。
03:08
Choice supportive bias.
決策合理化偏誤(Choice supportive bias)。
03:11
You regret buying that phone.
你後悔買了那支手機。
03:13
It lags when you open apps.
每當打開應用程式時,它都會卡頓。
03:14
The battery dies by lunch.
電池到午餐時間就耗盡了。
03:15
And autocorrect turns "help" into "hello."
自動校正還會把「幫忙」變成「哈囉」。
03:18
But instead of returning it, you say, "It's not that bad.
但你沒有退貨,反而說:「沒那麼糟啦。
03:22
At least it has Bluetooth." You would call that choice supportive bias.
至少它還有藍牙。」這就是所謂的決策合理化偏誤。
03:25
When your brain edits the story to make your own decisions look smarter than they were.
當你的大腦修改故事,讓自己的決定看起來比實際更明智時。
03:30
The worse the choice, the harder your brain works to defend it.
決策越糟糕,大腦就越努力為其辯護。
03:33
Suddenly, that three-hour battery life is a reminder to unplug.
突然間,三小時的電池壽命成了提醒你放下手機的契機。
03:37
And dropped calls are a feature, not a bug.
通話中斷成了特色,而不是缺陷。
03:39
It's not just phones.
不只手機如此。
03:42
You'll see this with jobs, friends, and even political votes.
在工作、朋友甚至政治投票上,你也會看到這種現象。
03:45
Admitting you messed up feels worse than the mess itself.
承認自己搞砸了,比搞砸本身更讓人難受。
03:50
So, your brain rewrites the past and sticks a smiley face on it.
所以,你的大腦會重寫過去,並在上面貼上一張笑臉。
03:53
The halo effect.
光環效應。
03:54
If a person walks in with a sharp outfit and a confident smile, before they even
如果一個人穿著筆挺的套裝、帶著自信的微笑走進來,
03:57
speak, your brain fills in the blanks.
在他們開口之前,你的大腦就已經填補了空白。
03:59
Smart, kind, capable. Probably volunteers on weekends and eats kale for fun.
聰明、善良、能幹,可能週末還當志工,吃羽衣甘藍當樂趣。
04:04
That's the halo effect.
這就是光環效應。
04:07
One good trait makes the rest look golden.
一個優點讓其他方面看起來都閃閃發光。
04:10
It's not logic, it's reflex.
這不是邏輯,而是反射動作。
04:11
We trust charm over facts, confidence over content.
我們信任魅力勝過事實,信任自信勝過內容。
04:15
A good first impression isn't just strong, it's sticky, and it cuts both ways.
好的第一印象不僅強烈,還很難抹滅,而且它有雙面性。
04:20
One awkward moment and that same person becomes careless, cold or shady.
只要一個尷尬的時刻,同一個人就會變得粗心、冷漠或鬼祟。
04:25
Their halo slips and suddenly it's horns.
他們的光環滑落,突然變成了惡魔的角。
04:28
The brain loves fast judgments.
大腦喜歡快速判斷。
04:30
Hindsight bias.
後見之明偏差。
04:30
The underdog wins the game and suddenly everyone knew it all along.
黑馬贏了比賽,突然間所有人都說他們早就知道會這樣。
04:34
Except they didn't.
但他們其實並不知道。
04:36
This is hindsight bias.
這就是後見之明偏差。
04:36
Your brain's habit of rewriting the past to make you feel like a genius in retrospect.
你的大腦習慣重寫過去,讓你在回顧時覺得自己像個天才。
04:42
Once you know the ending, everything gets rebranded as obvious.
一旦你知道了結局,一切都會被重新包裝成理所當然。
04:45
They looked confident in warm-ups.
他們熱身時看起來就很有自信。
04:49
The signs were there.
證據早就擺在眼前。
04:49
Random guesses feel like predictions, and uncertainty gets erased.
隨機的猜測被當成了預測,不確定性就此消失。
04:52
It shows up after elections, failed startups, breakups, even plot twists.
它在選舉後、新創失敗時、分手時,甚至劇情大逆轉時出現。
04:57
Once the answer's known, your brain fills in the blanks to pretend it was never confused.
一旦答案揭曉,你的大腦就會填補空白,假裝自己從未困惑過。
05:03
You didn't see it coming.
你當初沒看出來。
05:03
You just got a spoiler.
你只是被劇透了。
05:06
And your brain took the credit.
而你的大腦卻把功勞攬在自己身上。
05:07
Authority bias.
權威偏誤。
05:09
A guy in a lab coat says something sketchy, but instead of questioning it, your
穿著白袍的人說了些可疑的話,但你非但沒有質疑,你的
05:12
brain nods like he's the truth fairy.
大腦卻點頭如搗蒜,彷彿他是真理仙子。
05:15
Welcome to authority bias, where you trust someone just because they look like they know stuff.
歡迎來到權威偏誤,在這裡,你只因為某人看起來懂很多就信任他。
05:21
It happens in hospitals, classrooms, offices, and even on YouTube.
這種情況發生在醫院、教室、辦公室,甚至在 YouTube 上。
05:25
The right tone of voice or a fancy title can override logic.
正確的語氣或華麗的頭銜,都能凌駕於邏輯之上。
05:30
If a teacher says it, it must be true.
老師說的,那就一定是真的。
05:30
Someone in a suit? Definitely reliable.
穿西裝的人絕對可靠。
05:33
The brain skips the fact check and just stamps approved.
大腦跳過查證事實的步驟,直接蓋上核可章。
05:37
One study found that kids believed wrong answers more often if the person wore a uniform.
一項研究發現,如果發言者穿著制服,孩童更容易相信錯誤的答案。
05:42
Adults were the same deal, just with more confidence and less awareness.
成年人也是一樣,只是更有自信,也更缺乏自覺。
05:45
Survivorship bias.
倖存者偏誤。
05:47
You hear about one person who made millions selling pet rocks.
你聽說過有人靠賣寵物石頭賺了幾百萬。
05:50
Next thing you know, you're out back giving names to pebbles.
接著你就會開始幫小石頭取名字。
05:54
Turns out survivorship bias makes it easy to forget the thousands who tried and failed.
事實證明,倖存者偏誤讓我們很容易忘記那成千上萬嘗試過卻失敗的人。
05:58
It shows up in businesses, fitness, and self-help.
它出現在商業、健身和自我成長領域。
06:02
Anywhere winners get the spotlight.
任何贏家成為焦點的地方。
06:04
But for every winner, thousands failed quietly.
但每一個贏家背後,都有成千上萬的人默默失敗。
06:07
You just don't hear from them because failure doesn't post on Instagram.
你只是聽不到他們的聲音,因為失敗不會在 Instagram 上發文。
06:10
Even in war, this mattered.
即使在戰爭中,這點也很重要。
06:11
During World War II, engineers studied the bullet holes on planes that made it back to base and started
二戰期間,工程師研究了返航飛機上的彈孔,並開始
06:16
reinforcing those areas.
加固那些區域。
06:19
It took one person to point out the obvious.
需要一個人來點出這個顯而易見的事實。
06:22
The real danger was in the spots that weren't damaged because the planes that
真正的危險在那些沒有損壞的地方,因為被擊中那些部位的飛機
06:24
were hit there never made it home.
根本飛不回來。
06:26
Spotlight effect.
聚光燈效應。
06:28
If you were to trip on the stairs and think, "Ah, everyone saw that." They didn't.
如果你在樓梯上絆倒,心想:「啊,大家都看到了。」其實他們沒有。
06:33
Your brain just pulled the spotlight effect.
你的大腦只是產生了聚光燈效應。
06:36
The illusion that people notice you way more than they actually do.
一種人們過度關注你的錯覺。
06:39
It's your brain putting you on center stage in a play no one's watching.
這是你的大腦讓你站在沒人看的舞台中央。
06:42
You fumble a word, wear mismatched socks, or have a bad hair day and feel like the entire planet's watching.
你說錯話、穿錯襪子,或是髮型糟糕,就覺得全世界都在盯著你看。
06:48
Reality check.
現實檢驗。
06:49
They're not.
其實沒人注意。
06:51
In one study, people wore embarrassing t-shirts in public.
一項研究中,人們在公共場合穿著尷尬的T恤。
06:53
They guessed half the room noticed.
他們猜有一半的人會注意到。
06:55
Barely anyone did.
結果幾乎沒人發現。
06:55
You're not the star of the show.
你不是眾人矚目的焦點。
06:57
Everyone else is too busy worrying they're the star.
其他人都忙著擔心自己才是焦點。
07:02
Anchoring bias.
錨定效應(Anchoring bias)。
07:05
You see a $2,000 TV, then one for $800.
你看到一台2000美元的電視,然後看到一台800美元的。
07:07
That second one feels like a bargain, even if you only plan to spend $300.
即使你只打算花300美元,第二台還是讓你覺得像撿到便宜。
07:11
That's anchoring bias in action.
這就是錨定效應在運作。
07:13
The first number sets the bar and your brain starts comparing everything to it.
第一個數字設定了基準,你的大腦會開始拿所有東西跟它比較。
07:17
This mental shortcut shows up in stores, salary talks, trivia, and even dating.
這種思維捷徑會出現在商店、薪資談判、冷知識,甚至約會中。
07:22
First impressions leave a price tag in your mind and it doesn't matter if it's random.
第一印象會在你心中貼上價格標籤,即使那是隨機的也無所謂。
07:27
In one study, people spun a wheel rigged to land on 10 or 65, then guessed how many countries are in Africa.
一項研究中,人們轉動一個動過手腳的輪盤,結果停在10或65,然後猜非洲有多少個國家。
07:33
The higher the spin, the higher the guess.
轉出來的數字越高,猜的數量就越高。
07:35
The wheel meant nothing, but the number stuck anyway.
輪盤毫無意義,但數字就這樣烙印在心裡。
07:39
Your brain doesn't ask, "Does this make sense?" It asks, "What number did I see first?" and
你的大腦不會問:「這合理嗎?」它問的是:「我第一個看到的數字是什麼?」然後
07:43
builds reality around it.
根據它來建構現實。
07:47
Bandwagon effect.
從眾效應。
07:52
One person buys it, then 10, then a million, and suddenly it feels like you have to join in, even if you have no idea what it is.
一個人買了它,然後十個人,接著一百萬人,突然間你感覺自己也必須加入,即使你根本不知道那是什麼。
07:55
This is the bandwagon effect.
這就是從眾效應。
07:56
It's why people wear trends they hate or back causes they barely understand.
這就是為什麼人們會穿著自己討厭的流行服飾,或支持自己幾乎不了解的訴求。
08:01
When something looks popular, your brain doesn't double-check.
當某樣東西看起來很受歡迎時,你的大腦不會再次確認。
08:04
Instead, it copies it.
相反地,它會直接模仿。
08:07
This bias isn't new.
這種偏誤並不新鮮。
08:07
In 17th-century Holland, tulips became so trendy that people sold their homes just to buy tulip bulbs.
在 17 世紀的荷蘭,鬱金香變得非常流行,人們甚至賣掉房子只為了買鬱金香球莖。
08:14
The market crashed.
市場崩盤了。
08:16
Obviously, the bandwagon doesn't care if something's good.
顯然,從眾效應不在乎某樣東西好不好。
08:18
It only cares if it's crowded.
它只在乎它熱不熱門。
08:21
And once everyone's cheering, your brain worries about being the one person not clapping.
一旦所有人都在歡呼,你的大腦就會擔心自己成為唯一一個不鼓掌的人。
08:25
So you clap too.
所以你也跟著鼓掌。
08:27
10 compliments, one insult.
十句讚美,一句侮辱。
08:27
Negativity bias.
負面偏誤。
08:30
Guess which one keeps you up at night.
猜猜哪一句會讓你輾轉難眠。
08:33
And psychologists call this the negativity bias.
心理學家稱此為負面偏誤。
08:34
Your brain's habit of giving more weight to bad experiences than good ones.
你的大腦習慣對糟糕的經歷賦予比美好經歷更高的權重。
08:38
It's not because you enjoy being miserable.
這並不是因為你喜歡自找苦吃。
08:40
It's survival wiring.
這是生存本能。
08:43
Our ancestors didn't need to remember every nice sunset, but they did need to
我們的祖先不需要記住每一個美好的日落,但他們確實需要
08:46
remember the time berries nearly killed someone.
記住那些漿果差點害死人的經歷。
08:49
So today, you forget the praise, but replay that one awkward moment on a loop.
所以今天,你會忘記讚美,卻在腦海中不斷重播那一次尷尬的場景。
08:54
This bias shows up in relationships, news, memory, and even parenting.
這種偏誤會出現在人際關係、新聞、記憶,甚至育兒中。
08:57
You'll obsess over one rude comment, even if everything else was fine.
你會對一句無禮的評論耿耿於懷,即使其他一切都很好。
09:01
Your brain thinks it's protecting you, but mostly it's just being dramatic.
你的大腦以為在保護你,但大多數時候只是在小題大作。
09:05
Negativity isn't truth.
負面想法並不代表真相。
09:07
It's just the brain yelling louder when it's scared.
只是大腦在恐懼時大聲喊叫罷了。
09:09
Bias blind spot.
偏誤盲點。
09:11
You can spot bias in everyone else.
你能看出別人身上的偏誤。
09:14
Your uncle's politics, your friend's taste in movies, that guy at work who does his own
你叔叔的政治立場、你朋友的電影品味、那個在公司裡自以為
09:17
research, but your own thinking feels perfectly neutral.
做研究的傢伙,但你自己的思維卻感覺完全中立。
09:22
You then have the bias blind spot, the belief that everyone else is biased except you.
這就是偏誤盲點,認為除了自己之外,每個人都有偏誤。
09:27
Even when people are told how cognitive biases work, they rate themselves as
即使人們被告知認知偏誤是如何運作的,他們仍認為自己
09:30
less biased than others.
比其他人更少偏誤。
09:33
It's not because they're lying.
這並不是因為他們在說謊。
09:34
It's because the brain doesn't flag its own shortcuts.
是因為大腦不會標記自己的捷徑。
09:36
You don't see the tilt when you're standing on the slope.
當你站在斜坡上時,你不會感覺到傾斜。
09:40
The result is that you overtrust your opinions, underexamine your instincts, and assume
結果就是你過度信任自己的觀點,未能充分檢視自己的直覺,並假設
09:44
your side is the clear-headed one.
自己這一方才是頭腦清醒的。
09:46
Proportionality bias.
比例偏誤(Proportionality bias)。
09:49
A disaster strikes and people go looking for a mastermind.
當災難發生時,人們會去尋找幕後主使。
09:51
A bridge collapses.
橋樑倒塌了。
09:53
Then it must have been sabotaged.
那一定是有人蓄意破壞。
09:54
A public figure dies.
有位公眾人物去世了。
09:55
Someone had to have planned it.
一定是有人策劃了這件事。
09:58
It's hard to accept that something massive could happen because of something
人們很難接受如此重大的事件,竟然可能是因為一些
10:00
stupid.
愚蠢的小事而發生。
10:01
A bad wire, a missed warning, a guy who didn't double check.
一條壞掉的電線、一個被忽略的警告、一個沒有再次確認的人。
10:06
That's proportionality bias.
這就是比例偏誤。
10:06
The brain's way of matching the size of an outcome to the size of a cause.
大腦試圖讓結果的規模與起因的規模相匹配的方式。
10:10
Big things shouldn't fall apart for small reasons.
重大的事情不該因為微不足道的理由而崩潰。
10:14
So, we invent bigger ones, something darker, something that makes it feel earned.
所以我們會編造出更重大的理由,一些更黑暗的內幕,讓這件事感覺像是必然的結果。
10:17
It's why conspiracy theories spread after tragedy.
這就是為什麼陰謀論總是在悲劇之後蔓延。
10:20
We're not wired to accept chaos.
我們天生就難以接受混亂。
10:22
We'd rather believe in villains than in bad luck and loose screws.
與其相信純粹的運氣不好或螺絲鬆了,我們寧願相信有壞人存在。
10:27
Bias blind spot.
偏誤盲點(Bias blind spot)。
10:28
Most people think they're better than average drivers, better than average thinkers, and somehow less biased than everyone else.
大多數人認為自己開車技術比平均好,思考能力也比平均強,而且不知為何覺得自己比其他人更沒有偏見。
10:36
That little illusion is called the bias blind spot.
這個小幻覺被稱為偏見盲點。
10:37
Believing that mental traps are a problem for other people, not you.
相信心理陷阱是別人的問題,而不是你的。
10:43
You spot confirmation bias in your uncle's rants, halo effect in your friend's
你在叔叔的咆哮中發現確認偏誤,在朋友的迷戀中看到光環效應,
10:45
crush, and anchoring in your boss's decisions.
在老闆的決策中看到錨定效應。
10:49
But when you make the same mental slip, it feels like pure logic, not bias.
但當你犯了同樣的心理失誤時,卻感覺那是純粹的邏輯,而不是偏見。
10:54
The blind spot isn't about stupidity.
盲點與愚蠢無關。
10:56
It's about self-trust.
而是因為你太信任自己。
10:59
Your brain assumes it sees clearly, even while filtering everything through
你的大腦假設自己視野清晰,即使它正透過
11:01
invisible lenses.
看不見的鏡片過濾一切。
11:03
Ironically, knowing about bias doesn't fix it.
諷刺的是,了解偏見並不能解決它。
11:07
In some cases, it just makes you feel immune, which makes you even more biased.
在某些情況下,這只會讓你感覺免疫,從而讓你更有偏見。
11:11
The empathy gap.
同理心落差。
11:12
People tend to misjudge how much their emotions will shape their decisions, especially when they're not currently feeling those emotions.
人們往往會誤判情緒對其決策的影響程度,尤其是當他們當下並未感受到那些情緒時。
11:20
This is called the empathy gap.
這被稱為同理心落差。
11:21
When you're calm, you believe you'll stay cool in a fight.
當你冷靜時,你相信自己在爭吵中也能保持冷靜。
11:23
When you're full, you think grocery shopping while hungry won't change what ends up in your cart.
當你吃飽時,你認為在飢餓時去超市購物不會改變你最終放進購物車的東西。
11:29
But when emotions hit, logic usually gets shoved out of the way.
但當情緒來襲時,邏輯通常會被推到一邊。
11:32
Say someone swears they'll stick to their diet.
假設有人發誓會堅持他們的飲食計畫。
11:33
Sounds easy when they're not staring down a pizza after a long stressful day.
在經過漫長壓力大的一天後,面對披薩時就不會這麼容易了。
11:38
In the moment, the brain forgets all the earlier promises because it didn't plan for being tired, hungry, and annoyed.
在當下,大腦會忘記所有先前的承諾,因為它並未計畫要應對疲憊、飢餓與煩躁。
11:44
The empathy gap is that blind spot.
同理心落差就是這個盲點。
11:47
Your brain assumes your future self will act like your present self, but it won't.
你的大腦假設未來的你會像現在的你一樣行動,但事實並非如此。
11:51
The IKEA effect.
宜家效應(IKEA effect)。
11:53
You spend hours building a wobbly shelf.
你花了好幾個小時組裝一個搖搖晃晃的層架。
11:57
The instructions made no sense.
說明書根本看不懂。
11:58
Three screws are still on the floor, but when it's done, you love it.
還有三顆螺絲掉在地上,但當它完成時,你卻愛上它。
12:01
That would be the IKEA effect.
這就是宜家效應。
12:02
Your brain inflates the value of anything you built yourself.
你的大腦會高估任何你親手打造之物的價值。
12:05
It's not just furniture.
不僅限於家具。
12:08
People defend messy projects, broken code, or bad ideas simply because they put effort
人們會為凌亂的專案、糟糕的程式碼或不好的想法辯護,只因為他們為此付出了心力。
12:12
into it.
12:13
The more work it took, the harder it is to admit it's not great, even if it leans to the left.
花費的心力越多,就越難承認它其實不怎麼樣,即使它歪向一邊。
12:21
This bias is why DIY kits feel more rewarding and why creators overestimate how good their work is.
這種偏見就是為什麼 DIY 套件讓人更有成就感,以及為什麼創作者會高估自己作品的品質。
12:25
It's not about quality, but about ownership.
這與品質無關,而是與擁有權有關。
12:29
Your brain mistakes effort for quality and calls it pride.
你的大腦會將付出的心力誤判為品質,並把這種感受稱作自豪。
12:32
Illusory truth effect.
虛構真理效應(Illusory truth effect)。
12:34
Hear something a few times and it starts to feel true even if it isn't.
一句話聽了幾次,就算不是真的,也會開始覺得它是真的。
12:39
This is the illusory truth effect at work.
這就是虛構真理效應在發揮作用。
12:41
The brain loves what it already recognizes and sometimes it confuses repetition with reliability.
大腦喜歡它已經識別的東西,有時會把重複與可靠性混為一談。
12:46
It's how rumors keep going.
謠言就是這樣流傳的。
12:49
How madeup facts turn into common knowledge and why something you heard once in passing
虛構的事實如何變成常識,以及為何你偶然聽過一次的東西
12:52
might still sound right years later.
幾年後聽起來可能還是對的。
12:56
Not because it is, but because your brain is used to it.
不是因為它是對的,而是因為你的大腦已經習慣了。
12:58
Even smart people fall for it.
就連聰明人也會上當。
12:59
Studies show that the more you hear a statement, even if you know it's false, the more believable it starts to feel.
研究顯示,你聽到某個陳述的次數越多,即使你知道它是錯的,它聽起來就越可信。
13:05
It's not logic, it's comfort.
這不是邏輯,而是舒適感。
13:08
In a world of repeated slogans, recycled headlines, and non-stop content, the
在一個充滿重複口號、回收標題和無止盡內容的世界裡,
13:11
truth doesn't always win.
真相並不總是勝利者。
13:15
Sometimes it just shows up more often.
有時候,它只是出現得更頻繁。
13:17
Hold your hand in cold water, then put it in lukewarm water.
把手放進冷水裡,然後再放進溫水裡。
13:17
Contrast effect.
對比效應。
13:22
It'll feel hot.
溫水會讓你覺得熱。
13:22
Reverse the order and it'll feel cold.
反過來做,同樣的水就會讓你覺得冷。
13:23
You're experiencing the contrast effect.
你正在經歷對比效應。
13:26
The brain doesn't judge things on their own.
大腦不會單獨評判事物。
13:28
It compares them to whatever came right before.
它會將它們與之前剛剛接觸到的東西進行比較。
13:32
This messes with perception everywhere.
這會隨處扭曲認知。
13:34
A regular apartment looks fancy after touring a dump.
看完爛房子後,普通公寓也變豪華。
13:36
A decent job offer seems terrible if you just saw one that pays double.
如果剛看過薪水翻倍的工作,一份不錯的錄取通知也會變得很差。
13:40
Even people get rated differently depending on who they're standing next to.
就連人的評價,也會因身旁是誰而不同。
13:43
Your brain anchors to extremes.
你的大腦會以極端值為錨點。
13:47
It sets a mental baseline, then shifts expectations without telling you.
它會設定心理基準,然後在不告知的情況下轉移你的期望。
13:51
Same object, different setting, and suddenly your opinion flips.
同樣的物品,不同的情境,你的看法就會突然逆轉。
13:54
You're not judging things fairly.
你並未公平地評判事物。
13:56
Moral credential effect.
道德憑證效應。
13:57
You do one good thing, like voting, donating, or speaking up, and suddenly your brain gives itself a pass.
當你做了一件好事,像是投票、捐款或發聲,大腦就會突然放過自己。
14:05
The moral credential effect is when past good behavior makes people feel licensed
道德憑證效應是指,過去的良好行為會讓人覺得自己獲得許可,
14:08
to slack off or even act badly without guilt.
可以鬆懈,甚至毫無愧疚地做壞事。
14:12
Someone supports a cause once and then says, "I've done my part." Or a person speaks out against bias and later makes a biased decision, thinking it's fine because they've already proven they're fair.
有人支持某項公益一次,就說:「我已經盡力了。」或者有人發聲反對偏見,後來卻做出帶有偏見的決定,心想沒關係,因為自己已經證明過是公平的。
14:23
It's like your brain keeps moral receipts, but instead of using them to be better, it uses them to justify worse.
這就像大腦保留了道德收據,但不是用來督促自己更好,而是用來合理化更糟的行為。
14:29
Doing the right thing once doesn't make you immune to mistakes, but this bias makes you act like it does.
做一次正確的事,不代表你不會犯錯,但這種偏見會讓你表現得好像真的不會犯錯一樣。
14:36
The decoy effect.
誘餌效應。
14:39
Imagine you're buying popcorn.
想像你正在買爆米花。
14:40
A large costs $7, but then there's a medium for $6.
大份要 7 美元,但還有中份只要 6 美元。
14:40
A small costs $3.
小份則是 3 美元。
14:45
Suddenly, the large feels like a better deal.
突然間,大份看起來更划算了。
14:48
Just $1 more for way more popcorn.
只要多花 1 美元,就能買到多很多的爆米花。
14:50
That's the decoy effect.
這就是「誘餌效應」。
14:50
When a third less attractive option is added to steer you toward a more expensive or preferred choice, it shows
當加入第三個較不具吸引力的選項,引導你選擇更貴或更偏好的選擇時,這種現象
14:57
up in everything from shopping to politics.
從購物到政治領域都看得到。
14:59
A candidate might seem more appealing when placed next to an obviously weaker one.
當候選人與明顯較弱的對手並列時,可能會顯得更有吸引力。
15:05
A product looks premium when the worst version sits right beside it.
當最差的版本就在旁邊時,產品看起來會更高級。
15:08
The decoy isn't meant to be chosen.
誘餌選項並不是要被選中的。
15:10
It's there to make the target look good by comparison.
它的存在是為了透過比較,讓目標選項看起來更好。
15:13
Your brain thinks it's making a smart, independent choice, but it's really
你的大腦以為自己正在做出聰明且獨立的選擇,但其實
15:16
reacting to a setup.
只是對預設的局做出反應。