
Translated Tech [Series]: Visualized Linear Algebra to Get Started with Machine Learning (Part 2)

  Artificial intelligence is widely used in image recognition, speech recognition, natural language processing, intelligent recommendation, autonomous driving, intelligent manufacturing, healthcare, and many other fields, and it has had a profound impact on social, economic, and technological development.

  Machine learning is the core of artificial intelligence and the fundamental route to making computers intelligent. It is an interdisciplinary field that draws on many subjects and studies how computers can simulate or implement human learning behavior in order to acquire new knowledge and skills, and how they can reorganize existing knowledge structures to keep improving their own performance. The progress of machine learning has produced astonishing achievements and has thoroughly changed how people think about artificial intelligence. To understand machine learning, you first need the mathematical foundations, including linear algebra, calculus, and probability theory; these foundations are essential for understanding deep learning and other areas of artificial intelligence.

  Starting February 27, 数据观 is serializing the series “Visualized Linear Algebra to Get Started with Machine Learning” by Marcello Politi, a machine learning expert at the European Space Agency, offering a professional and in-depth account of the “black magic” of machine learning.

Visualized Linear Algebra to Get Started with Machine Learning: Part 2

  Master the elements of linear algebra, starting with simple, visual explanations of the basic concepts

Introduction

  In this article, we continue the work we started in “Visualized Linear Algebra to Get Started with Machine Learning: Part 1”. We tackle new concepts of linear algebra in a simple and intuitive way. These articles are intended to introduce you to the world of linear algebra and to show how strongly the study of this and other mathematical subjects is tied to data science.

Index

Solve Equations

Determinants

Advanced Changing Basis

Eigenvalues and Eigenvectors

Calculating Eigenvalues and Eigenvectors

  Solve Equations

  Let’s finally try to understand how to solve simultaneous equations. By now you will have become familiar with writing equations compactly using matrices and vectors, as in this example.

Equation (Image By Author)
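  The figure itself is not reproduced here; as a hypothetical stand-in, a system of two equations in the unknowns a and b, written compactly in matrix form, looks like this:

2a + 3b = 8
 a + 2b = 5      ⇔      A r = s,   with A = [[2, 3], [1, 2]],   r = [a, b],   s = [8, 5]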

  Finding the vector of unknowns r = [a, b] is quite straightforward; we only need to multiply both sides of the equation by the inverse of the matrix A.

Solve Equation (Image By Author)

  We see that A^-1 and A cancel, since multiplying a matrix by its inverse always gives the identity matrix (that is, the matrix that has all 1’s on the main diagonal and zeros elsewhere). And so we find the value of r.
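  Spelled out step by step, in the same notation, the cancellation works like this:

A r = s
A^-1 A r = A^-1 s
I r = A^-1 s
r = A^-1 s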

  But in order to do this we have to compute A^-1, which may not be so simple. Programming languages usually ship with very efficient, already-implemented algorithms for computing the inverse matrix, so in practice you should use those. But in case you want to learn how to do this calculation by hand, you will have to use Gaussian elimination.

  This is, for example, how you compute the inverse by using numpy in Python.
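  The snippet embedded in the original is not reproduced here; a minimal sketch, using the hypothetical system above, might look like this:

import numpy as np

# Hypothetical 2x2 system: 2a + 3b = 8,  a + 2b = 5
A = np.array([[2.0, 3.0],
              [1.0, 2.0]])
s = np.array([8.0, 5.0])

A_inv = np.linalg.inv(A)  # inverse of A (raises LinAlgError if A is singular)
r = A_inv @ s             # r = A^-1 s
print(r)                  # [1. 2.]  ->  a = 1, b = 2

# In practice, np.linalg.solve(A, s) is preferred: it solves the system
# directly and more stably, without forming the inverse explicitly.
print(np.linalg.solve(A, s))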

  Determinants

  The determinant is another fundamental concept in linear algebra. In college you are often taught how to calculate it, but not what it is. We can associate a value with each matrix, and that value is precisely the determinant. You can think of the determinant as the area of the deformed space.

  We have seen how each matrix is simply a deformation of space. Let us give an example.

Determinant (Image By Author)

  If we calculate the area of the new space, as shown in the figure, this area is precisely the determinant associated with the starting matrix. In this case, the determinant = a*d.

  Certainly, there are matrices that describe somewhat more complex deformations of space, and in that case it may not be so trivial to calculate the area, i.e., the determinant.

  For this, there are known formulas for calculating the determinant. For example, let us see the formula for calculating the determinant of a 2x2 matrix.

Compute Determinant of 2x2 Matrix (Image By Author)
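  For reference, the formula shown in the figure is the standard one: for a 2x2 matrix with rows [a, b] and [c, d],

det(A) = a*d - b*c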

  You can look here to learn how to calculate the determinant in general cases with larger matrices.

  If you think about it, however, there are transformations that do not create any area. Let’s look at the following example.

Det equal zero (Image By Author)

  In this example, the matrix does not allow us to create any area, so we have a determinant equal to zero.

  But what is the use of knowing the determinant? We have seen that to solve simultaneous equations we need to be able to calculate the inverse of a matrix.

  But the inverse of a matrix does not exist if the determinant is equal to zero! That is why it is important to know how to calculate the determinant: it tells us whether the problem has a solution at all.
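  A quick way to check this in code, in the same numpy style as above (a sketch, not the author’s snippet):

import numpy as np

# A singular matrix: the second row is a multiple of the first,
# so the transformation squashes the plane onto a line (no area).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(A))  # 0.0 (up to floating-point sign/rounding) -> no inverse
# np.linalg.inv(A) would raise numpy.linalg.LinAlgError: Singular matrix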

  You can think of the inverse matrix as a way of transforming the space back to the original space. But when a matrix creates not an area but only a segment, taking us from 2D space down to 1D, the inverse matrix does not have enough information and will never be able to take us back from 1D to the original 2D space.

  Advanced Changing Basis

  We have already seen in the previous article the basic example of changing the basis, but now let’s look at a somewhat more complex example.

  Let’s imagine the existence of two worlds, ours and Narnia’s. In our world, we use the vectors e1 and e2 as our reference vectors, as the basis. Thanks to these vectors we are able to create others and assign coordinates to them. For example, we can create the vectors [1,1] and [3,1].

Our world (Image By Author)

  In the world of Narnia, though, they use different vectors as a basis. Can you guess which ones they use? Just the ones we call [1,1] and [3,1].

Narnia’s world (Image By Author)

  The people of Narnia will then use this basis of theirs to define other vectors of space; for example, they may define the vector [3/2, 1/2].

Vector in Narnia’s world (Image By Author)

  Well, now what I want to find out is: how do I define that red vector based on the coordinates of my world?

  We have already seen this: we take the vectors that form the basis of Narnia, expressed in the coordinates of my world, so [1,1] and [3,1]. We put them in a matrix and multiply this matrix by the red vector.

Changing Basis (Image By Author)
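  Worked out in numbers, with Narnia’s basis vectors as the columns of a matrix N:

N = [[1, 3],
     [1, 1]]        (columns [1,1] and [3,1]: Narnia’s basis in our coordinates)

N * [3/2, 1/2] = [1*(3/2) + 3*(1/2), 1*(3/2) + 1*(1/2)] = [3, 2]

So the vector the Narnians call [3/2, 1/2] is the one we call [3, 2].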

  Now we ask: can we do the reverse as well? Can I express a vector of my world according to the coordinates they use in Narnia? Of course!

  It will suffice to do the same process but change the point of view. But why do we do all this? Very often, when we have to describe vectors or transformations, the notation becomes much simpler if we use a different basis.

  Suppose we want to apply a transformation R to a vector, but this transformation turns out to be difficult to apply directly. Then we can first transform my vector into a vector in the world of Narnia by applying the matrix N. After that we apply the desired transformation R. And then we bring everything back to our original world with N^-1.
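  Written as a single formula, and following the convention of the paragraph above that N maps a vector into Narnia’s world, the whole round trip applied to a vector r is:

r' = N^-1 R N r

read right to left: N takes r into Narnia’s world, R applies the transformation there, and N^-1 brings the result back to ours.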

  This is something that can be very useful and can make life easier when we are dealing with complex transformations. I hope I have at least given you some insight; there is so much more to talk about.

  Eigenvalues and Eigenvectors

  We have already repeated several times that applying a linear transformation (a matrix) to a vector transforms that vector.

  However, there are cases in which the vector keeps its initial direction. Think, for example, of the case where we simply scale the space. If we visualize the horizontal and the vertical vectors, they remain in the same direction although they get longer or shorter.

Scale Space (Image By Author)

  We see in the image above that the linear transformation applied here is a scaling. But if we look at what happens to each individual vector, we notice that the red vectors still maintain the same direction.

  These vectors that maintain the same direction are called eigenvectors of the matrix that describes this transformation.

  In particular, the vertical red vector has remained unchanged, so let’s say it has eigenvalue = 1, while the other red vector has doubled in length, so let’s say it has eigenvalue = 2.
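  As a concrete (hypothetical) matrix matching this description, take a scaling that doubles the horizontal axis and leaves the vertical one alone:

A = [[2, 0],
     [0, 1]]

A * [1, 0] = [2, 0] = 2 * [1, 0]   ->  eigenvector [1, 0] with eigenvalue 2
A * [0, 1] = [0, 1] = 1 * [0, 1]   ->  eigenvector [0, 1] with eigenvalue 1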

  Obviously, depending on the matrix, and thus on the transformation, the number of eigenvectors may vary.

  Calculating Eigenvalues and Eigenvectors

  Let us now try to convert what we have expressed in words into a mathematical formula. Eigenvectors are those vectors which, when the matrix is applied to them, do not change direction; at most they lengthen or shorten.

Calculate Eigenvectors (Image By Author)
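  The condition shown in the figure is the standard eigenvector equation:

A x = lambda x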

  In the formula, A is a matrix, x is a vector, and lambda is a scalar. If the condition is satisfied, we say that x is an eigenvector of A with the corresponding eigenvalue lambda.

  By solving the previous equation we can find the eigenvalues that satisfy it; let’s see how to do it.

Characteristic polynomial (Image By Author)
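  The derivation behind the figure is the standard one: move everything to one side and factor out x,

A x = lambda x   ->   (A - lambda*I) x = 0

and for a non-zero x to exist, the matrix A - lambda*I must squash space onto something with zero area, i.e.

det(A - lambda*I) = 0

For a 2x2 matrix this condition is a quadratic polynomial in lambda, the characteristic polynomial, and its roots are the eigenvalues.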

  Once the eigenvalues have been found, it will suffice to substitute them into the following equation to find the eigenvectors.

Find eigenvectors (Image By Author)
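  The equation in the figure is (A - lambda*I) x = 0, solved once for each eigenvalue. In practice, numpy does both steps at once; a minimal sketch, reusing the hypothetical scaling matrix from above:

import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 1.0]])  # hypothetical scaling matrix from the example above

# np.linalg.eig returns the eigenvalues and a matrix whose columns
# are the corresponding (normalized) eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)   # [2. 1.]
print(eigenvectors)  # columns [1, 0] and [0, 1]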

  Final Thoughts

  I hope you have found some useful insights in this article and that you have understood them without too much effort. The purpose is to get a little familiar with these terms and with the elements of linear algebra. That way, I hope that the next time you look at the documentation of sklearn or some other library, you will be able to better understand what that particular function you are using is actually doing!

責(zé)任編輯:張薇

分享: