Mock Test Mitra™

Engineering Mathematics – Complete Study Notes

ESE / IES · GATE · SSC JE

Comprehensive chapter-wise notes covering all 6 topics of Engineering Mathematics for IES ESE Paper I — linear algebra, calculus, differential equations, complex variables, probability & statistics, and numerical methods. All key formulae, proofs, worked examples, and exam tips included.

1. Linear Algebra

1.1 Matrices — Key Definitions

Term: Definition / Property
Rank of a matrix: number of linearly independent rows (or columns); equals the number of non-zero rows in row echelon form
Singular matrix: det(A) = 0; rank < n; the system Ax = b has either no solution or infinitely many
Symmetric: A = Aᵀ
Skew-symmetric: A = −Aᵀ; diagonal elements = 0
Orthogonal: AAᵀ = I; Aᵀ = A⁻¹; columns are orthonormal
Idempotent: A² = A
Nilpotent: Aᵏ = 0 for some positive integer k
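These definitions are easy to sanity-check numerically. A quick sketch with NumPy (the matrices below are hypothetical examples, not from the notes):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])   # symmetric: A = Aᵀ
S = np.array([[0.0, 4.0], [-4.0, 0.0]])  # skew-symmetric: A = −Aᵀ, zero diagonal
Q = np.array([[0.0, -1.0], [1.0, 0.0]])  # orthogonal: 90° rotation, QQᵀ = I
N = np.array([[0.0, 1.0], [0.0, 0.0]])   # nilpotent: N² = 0

assert np.allclose(A, A.T)
assert np.allclose(S, -S.T) and np.allclose(np.diag(S), 0)
assert np.allclose(Q @ Q.T, np.eye(2))   # columns are orthonormal
assert np.allclose(N @ N, np.zeros((2, 2)))  # k = 2 here

# Rank: second row is twice the first, so only one independent row.
M = np.array([[1.0, 2.0], [2.0, 4.0]])
assert np.linalg.matrix_rank(M) == 1     # rank < n → M is singular
assert abs(np.linalg.det(M)) < 1e-12
```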

1.2 Determinants

det(AB) = det(A) · det(B)
det(Aᵀ) = det(A)
det(kA) = kⁿ · det(A) for n×n matrix
det(A⁻¹) = 1 / det(A)
If two rows are identical or proportional → det = 0
Row/column swap → sign of det changes
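All six identities can be verified on a random matrix; a minimal check (the matrices and the scalar k are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
k = 2.5
det = np.linalg.det

assert np.isclose(det(A @ B), det(A) * det(B))      # det(AB) = det(A)·det(B)
assert np.isclose(det(A.T), det(A))                 # det(Aᵀ) = det(A)
assert np.isclose(det(k * A), k**n * det(A))        # det(kA) = kⁿ·det(A)
assert np.isclose(det(np.linalg.inv(A)), 1 / det(A))

# Swapping two rows flips the sign of the determinant.
A_swapped = A[[1, 0, 2], :]
assert np.isclose(det(A_swapped), -det(A))
```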

1.3 System of Linear Equations

Ax = b; augmented matrix [A|b]
ρ(A) = rank of A; ρ([A|b]) = rank of augmented matrix

Consistency (Rouché–Capelli theorem):
ρ(A) ≠ ρ([A|b]) → No solution (inconsistent)
ρ(A) = ρ([A|b]) = n (number of unknowns) → Unique solution
ρ(A) = ρ([A|b]) < n → Infinitely many solutions (n − ρ free variables)
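The three cases above translate directly into a rank comparison; a small sketch (the systems below are hypothetical examples):

```python
import numpy as np

def classify(A, b):
    """Classify Ax = b via Rouché–Capelli: compare rank(A) with rank([A|b])."""
    rA = np.linalg.matrix_rank(A)
    rAb = np.linalg.matrix_rank(np.column_stack([A, b]))
    n = A.shape[1]
    if rA != rAb:
        return "no solution"
    return "unique" if rA == n else "infinite"

A = np.array([[1.0, 1.0], [1.0, -1.0]])
assert classify(A, np.array([2.0, 0.0])) == "unique"        # rank 2 = n

A2 = np.array([[1.0, 2.0], [2.0, 4.0]])                     # rank 1
assert classify(A2, np.array([3.0, 6.0])) == "infinite"     # consistent, rank < n
assert classify(A2, np.array([3.0, 7.0])) == "no solution"  # ranks differ
```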

1.4 Eigenvalues and Eigenvectors

Ax = λx; characteristic equation: det(A − λI) = 0

Properties of eigenvalues:
Sum of eigenvalues = trace(A) = Σa_ii
Product of eigenvalues = det(A)
Eigenvalues of Aᵀ = eigenvalues of A
Eigenvalues of A⁻¹ = 1/λ (if A invertible)
Eigenvalues of Aᵏ = λᵏ
Symmetric matrix → all eigenvalues are real
Skew-symmetric → eigenvalues are zero or purely imaginary
Orthogonal matrix → |λ| = 1
📝 ESE Tip: Rank-nullity theorem: rank(A) + nullity(A) = n (number of columns). Cayley–Hamilton theorem: every matrix satisfies its own characteristic equation — A² = (trace)A − (det)I for 2×2.
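The trace/determinant properties and the 2×2 Cayley–Hamilton identity can be checked directly (A below is an arbitrary example matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2
lam = np.linalg.eigvals(A)

assert np.isclose(lam.sum(), np.trace(A))         # Σλ = trace(A)
assert np.isclose(lam.prod(), np.linalg.det(A))   # Πλ = det(A)
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(lam))  # eig(Aᵀ) = eig(A)

# Symmetric matrix → all eigenvalues real
S = np.array([[2.0, 1.0], [1.0, 2.0]])
assert np.allclose(np.linalg.eigvals(S).imag, 0)

# Cayley–Hamilton for 2×2: A² = (trace A)·A − (det A)·I
lhs = A @ A
rhs = np.trace(A) * A - np.linalg.det(A) * np.eye(2)
assert np.allclose(lhs, rhs)
```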
2. Calculus

2.1 Limits and Continuity

L'Hôpital's rule: if lim f/g → 0/0 or ∞/∞, then lim f/g = lim f'/g'
Continuity at x=a: lim(x→a⁻) f = lim(x→a⁺) f = f(a)
Differentiability ⟹ Continuity (converse not true)

2.2 Differentiation

Chain rule: d/dx[f(g(x))] = f'(g(x)) · g'(x)
Product rule: (uv)' = u'v + uv'
Quotient rule: (u/v)' = (u'v − uv') / v²
Leibniz rule (nth derivative of product): (uv)⁽ⁿ⁾ = Σ C(n,k) u⁽ᵏ⁾ v⁽ⁿ⁻ᵏ⁾

2.3 Integration

Integration by parts: ∫u dv = uv − ∫v du (ILATE order: Inverse, Log, Algebraic, Trig, Exp)
Definite integral properties:
∫₋ₐᵃ f(x)dx = 0 if f is odd; = 2∫₀ᵃ f(x)dx if f is even
Gamma function: Γ(n) = (n−1)! for integer n; Γ(1/2) = √π
Beta function: B(m,n) = Γ(m)Γ(n)/Γ(m+n)
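The Gamma and Beta identities can be verified with the standard library; the Beta check below compares the Γ formula against a direct midpoint-rule integral of x^(m−1)(1−x)^(n−1) (the values of m, n and the grid size are arbitrary choices):

```python
import math

# Γ(n) = (n−1)! for positive integers, and Γ(1/2) = √π
for n in range(1, 8):
    assert math.isclose(math.gamma(n), math.factorial(n - 1))
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi))
assert math.isclose(math.gamma(3.5), 2.5 * math.gamma(2.5))   # Γ(n+1) = n·Γ(n)

def beta(m, n):
    """B(m, n) = Γ(m)Γ(n)/Γ(m+n)."""
    return math.gamma(m) * math.gamma(n) / math.gamma(m + n)

# Cross-check against B(m,n) = ∫₀¹ x^(m−1)(1−x)^(n−1) dx via the midpoint rule
m, n, N = 3.0, 2.0, 20_000
h = 1.0 / N
integral = sum(((i + 0.5) * h) ** (m - 1) * (1 - (i + 0.5) * h) ** (n - 1)
               for i in range(N)) * h
assert math.isclose(beta(m, n), integral, rel_tol=1e-6)
```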

2.4 Partial Derivatives and Multivariable Calculus

Total differential: df = (∂f/∂x)dx + (∂f/∂y)dy
Euler's theorem for homogeneous functions of degree n: x(∂f/∂x) + y(∂f/∂y) = nf
Maxima/minima test (two variables): D = f_xx·f_yy − f_xy²
D > 0, f_xx > 0 → min; D > 0, f_xx < 0 → max; D < 0 → saddle point; D = 0 → test inconclusive
Jacobian: J = |∂(u,v)/∂(x,y)| (used in change of variables for double integrals)
📝 ESE Tip: Euler's theorem for homogeneous functions appears every few years. Maxima–minima via the second-derivative test (D test) is a high-yield topic. Gamma/Beta function values to remember: Γ(1/2) = √π, Γ(n+1) = nΓ(n).
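Euler's theorem is easy to confirm numerically with central finite differences; the function f(x, y) = x² + 3xy below is a hypothetical homogeneous function of degree 2, so x·∂f/∂x + y·∂f/∂y should equal 2f:

```python
def f(x, y):
    # homogeneous of degree 2: f(tx, ty) = t²·f(x, y)
    return x * x + 3 * x * y

def partial(g, x, y, wrt, h=1e-6):
    """Approximate a partial derivative by a central finite difference."""
    if wrt == "x":
        return (g(x + h, y) - g(x - h, y)) / (2 * h)
    return (g(x, y + h) - g(x, y - h)) / (2 * h)

x, y = 1.3, 0.7
lhs = x * partial(f, x, y, "x") + y * partial(f, x, y, "y")
assert abs(lhs - 2 * f(x, y)) < 1e-6   # x·f_x + y·f_y = n·f with n = 2
```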
3. Differential Equations

3.1 First-Order ODEs

Separable: f(y)dy = g(x)dx → integrate both sides
Linear: dy/dx + P(x)y = Q(x) → integrating factor μ = e^∫P dx → solution: yμ = ∫Qμ dx
Exact: M dx + N dy = 0 is exact if ∂M/∂y = ∂N/∂x → solution F(x,y) = c where ∂F/∂x=M, ∂F/∂y=N
Bernoulli: dy/dx + P(x)y = Q(x)yⁿ → substitute v = y^(1−n) → reduces to linear

3.2 Second-Order Linear ODEs with Constant Coefficients

ay'' + by' + cy = f(x); auxiliary equation: am² + bm + c = 0

Complementary solution y_c based on roots m₁, m₂:
Real distinct: y_c = C₁e^(m₁x) + C₂e^(m₂x)
Real repeated: y_c = (C₁ + C₂x)e^(mx)
Complex conjugate m = α ± iβ: y_c = e^(αx)[C₁cos(βx) + C₂sin(βx)]

Particular solution y_p by method of undetermined coefficients or variation of parameters
General solution: y = y_c + y_p

3.3 Laplace Transforms

f(t) → F(s) = L{f(t)}
1 → 1/s
t → 1/s²
tⁿ → n!/s^(n+1)
e^(at) → 1/(s−a)
sin(at) → a/(s²+a²)
cos(at) → s/(s²+a²)
e^(at)f(t) → F(s−a) [first shifting theorem]
f(t−a)u(t−a) → e^(−as)F(s) [second shifting theorem]
f'(t) → sF(s) − f(0)
f''(t) → s²F(s) − sf(0) − f'(0)
📝 ESE Tip: Laplace transform pairs and the derivative property are heavily tested. For IVP (initial value problems), take Laplace, solve for F(s), then use inverse Laplace (partial fractions) to get y(t).
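Two of the table entries can be checked by evaluating the defining integral ∫₀^∞ e^(−st) f(t) dt numerically; the truncation point T and grid size N below are arbitrary choices (valid for s > a so the integral converges):

```python
import math

def laplace_num(f, s, T=60.0, N=60_000):
    """Approximate L{f}(s) = ∫₀^∞ e^(−st) f(t) dt, truncated at T, midpoint rule."""
    h = T / N
    return sum(math.exp(-s * ((i + 0.5) * h)) * f((i + 0.5) * h)
               for i in range(N)) * h

# L{e^(at)} = 1/(s−a): here a = 1, s = 3
a, s = 1.0, 3.0
assert math.isclose(laplace_num(lambda t: math.exp(a * t), s),
                    1 / (s - a), rel_tol=1e-4)

# L{sin(at)} = a/(s²+a²): here a = 2, s = 1
assert math.isclose(laplace_num(lambda t: math.sin(2.0 * t), 1.0),
                    2.0 / (1.0 + 4.0), rel_tol=1e-4)
```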
4. Complex Variables

4.1 Analytic Functions

f(z) = u(x,y) + iv(x,y); z = x + iy
Cauchy–Riemann equations (necessary for analyticity):
∂u/∂x = ∂v/∂y and ∂u/∂y = −∂v/∂x

If C–R equations hold AND partial derivatives are continuous → f is analytic
If f is analytic, then u and v are both harmonic (∇²u = 0, ∇²v = 0)

4.2 Cauchy's Integral Theorem and Formula

Cauchy's theorem: if f(z) is analytic inside and on closed contour C → ∮_C f(z)dz = 0

Cauchy's integral formula:
f(a) = (1/2πi) ∮_C f(z)/(z−a) dz (a inside C)
f^(n)(a) = (n!/2πi) ∮_C f(z)/(z−a)^(n+1) dz

4.3 Residues and Poles

Pole of order m at z = a: lim(z→a) (z−a)^m f(z) exists and is finite and non-zero
Simple pole (m=1): Res[f,a] = lim(z→a) (z−a)f(z)
Pole of order m: Res[f,a] = (1/(m−1)!) × lim(z→a) d^(m−1)/dz^(m−1) [(z−a)^m f(z)]

Residue theorem: ∮_C f(z)dz = 2πi × Σ(residues inside C)
📝 ESE Tip: Cauchy–Riemann equations and the residue theorem for contour integrals are the two most-tested topics. Remember: simple pole residue = lim(z→a)(z−a)f(z) — quick and clean.
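The residue theorem can be verified by discretising a circular contour; the functions, pole locations, and radii below are hypothetical examples:

```python
import cmath, math

def contour_integral(f, centre, r, N=20_000):
    """Approximate ∮ f(z) dz around a circle |z − centre| = r (midpoint rule)."""
    total = 0j
    for k in range(N):
        t0 = 2 * math.pi * k / N
        t1 = 2 * math.pi * (k + 1) / N
        z0 = centre + r * cmath.exp(1j * t0)
        z1 = centre + r * cmath.exp(1j * t1)
        zm = centre + r * cmath.exp(1j * (t0 + t1) / 2)
        total += f(zm) * (z1 - z0)   # f at the arc midpoint times the chord dz
    return total

# Simple pole at a with residue 1: ∮ dz/(z−a) = 2πi
a = 0.5 + 0.5j
I1 = contour_integral(lambda z: 1 / (z - a), a, 1.0)
assert abs(I1 - 2j * math.pi) < 1e-5

# f(z) = 1/(z²+1) around z = i (radius 0.5 encloses only that pole):
# Res[f, i] = 1/(2i), so the integral is 2πi · 1/(2i) = π
I2 = contour_integral(lambda z: 1 / (z * z + 1), 1j, 0.5)
assert abs(I2 - math.pi) < 1e-5
```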
5. Probability & Statistics

5.1 Basic Probability

Bayes' theorem: P(A|B) = P(B|A)·P(A) / P(B)
Total probability: P(B) = Σ P(B|Aᵢ)·P(Aᵢ)
Independent events: P(A∩B) = P(A)·P(B)
Mutually exclusive: P(A∪B) = P(A) + P(B)
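A worked Bayes example of the kind ESE favours (the percentages below are hypothetical): machine A₁ makes 60% of items with a 2% defect rate, A₂ makes 40% with a 5% defect rate; given a defective item, find P(A₁|D).

```python
# Priors and conditional defect rates (hypothetical values)
P_A1, P_A2 = 0.6, 0.4
P_D_given_A1, P_D_given_A2 = 0.02, 0.05

# Total probability: P(D) = Σ P(D|Aᵢ)·P(Aᵢ)
P_D = P_D_given_A1 * P_A1 + P_D_given_A2 * P_A2
assert abs(P_D - 0.032) < 1e-12

# Bayes: P(A1|D) = P(D|A1)·P(A1) / P(D)
P_A1_given_D = P_D_given_A1 * P_A1 / P_D
assert abs(P_A1_given_D - 0.375) < 1e-12   # 0.012 / 0.032
```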

5.2 Standard Distributions

Distribution · PMF / PDF · Mean · Variance
Binomial B(n,p): PMF C(n,k)·p^k(1−p)^(n−k); mean np; variance np(1−p)
Poisson P(λ): PMF e^(−λ)λ^k/k!; mean λ; variance λ
Normal N(μ,σ²): PDF (1/(σ√(2π)))·e^(−(x−μ)²/(2σ²)); mean μ; variance σ²
Exponential Exp(λ): PDF λe^(−λx), x ≥ 0; mean 1/λ; variance 1/λ²
Uniform U(a,b): PDF 1/(b−a) on [a,b]; mean (a+b)/2; variance (b−a)²/12
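The mean/variance columns can be confirmed by summing the PMFs directly; λ, n, and p below are arbitrary test values:

```python
import math

# Poisson: mean = variance = λ (series truncated at k = 60, tail is negligible)
lam = 3.2
pmf = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(60)]
assert math.isclose(sum(pmf), 1.0, rel_tol=1e-12)
mean = sum(k * p for k, p in enumerate(pmf))
var = sum((k - mean) ** 2 * p for k, p in enumerate(pmf))
assert math.isclose(mean, lam, rel_tol=1e-9)
assert math.isclose(var, lam, rel_tol=1e-9)

# Binomial: mean np, variance np(1−p)
n, p = 10, 0.3
bpmf = [math.comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
bmean = sum(k * q for k, q in enumerate(bpmf))
bvar = sum((k - bmean) ** 2 * q for k, q in enumerate(bpmf))
assert math.isclose(bmean, n * p) and math.isclose(bvar, n * p * (1 - p))
```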

5.3 Statistics

Sample mean: x̄ = (1/n)Σxᵢ
Sample variance: s² = Σ(xᵢ−x̄)² / (n−1) [unbiased]
Standard error of mean: SE = σ/√n

Hypothesis testing:
H₀ = null hypothesis; H₁ = alternative
Type I error (α): reject H₀ when it is true (significance level)
Type II error (β): fail to reject H₀ when H₁ is true; Power = 1−β

Z-test (known σ): Z = (x̄ − μ₀)/(σ/√n)
t-test (unknown σ): t = (x̄ − μ₀)/(s/√n) with n−1 degrees of freedom
📝 ESE Tip: Poisson distribution (mean = variance = λ) is uniquely identified. Normal distribution Z-score and confidence intervals are tested often. Bayes' theorem conditional probability numericals appear in most ESE papers.
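A worked t-test of the kind described above (the sample and the hypothesised mean μ₀ = 50 are hypothetical; 3.499 is the standard two-sided 1% critical value for 7 degrees of freedom):

```python
import math
import statistics

data = [51.2, 49.8, 50.5, 52.1, 48.9, 50.7, 51.5, 49.4]
n = len(data)
xbar = sum(data) / n

# Unbiased sample variance divides by n − 1; statistics.variance does the same.
s2 = sum((x - xbar) ** 2 for x in data) / (n - 1)
assert math.isclose(s2, statistics.variance(data))

# t statistic with n − 1 = 7 degrees of freedom
mu0 = 50.0
t = (xbar - mu0) / (math.sqrt(s2) / math.sqrt(n))
assert abs(t) < 3.499   # below the 1% critical value → fail to reject H₀
```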
6. Numerical Methods

6.1 Root Finding

Newton–Raphson: x_(n+1) = x_n − f(x_n)/f'(x_n)
Convergence: second order (quadratic) — errors square each step
Fails when f'(x_n) ≈ 0 (near a stationary point or a flat region of the curve)

Bisection method: bracket [a,b] with f(a)·f(b) < 0; halve interval each step
Convergence: first order (linear); guaranteed but slow

Secant method: replaces derivative with finite difference; super-linear convergence
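A minimal comparison of the two main methods on a hypothetical problem, finding √2 as the root of f(x) = x² − 2:

```python
import math

f = lambda x: x * x - 2
df = lambda x: 2 * x

# Newton–Raphson: quadratic convergence — correct digits roughly double per step.
x = 1.0
for _ in range(6):
    x = x - f(x) / df(x)
assert abs(x - math.sqrt(2)) < 1e-12

# Bisection: guaranteed once f(a)·f(b) < 0, but only halves the error each step.
a, b = 1.0, 2.0
assert f(a) * f(b) < 0
for _ in range(50):
    m = (a + b) / 2
    if f(a) * f(m) <= 0:
        b = m
    else:
        a = m
assert abs((a + b) / 2 - math.sqrt(2)) < 1e-12
```

Note the step counts: Newton reaches machine precision in about 5 iterations; bisection needs roughly 50 halvings for the same accuracy.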

6.2 Numerical Integration

Trapezoidal rule: ∫f dx ≈ h/2 [f₀ + 2f₁ + 2f₂ + … + 2f_(n−1) + f_n]
Error: O(h³) per step; O(h²) overall

Simpson's 1/3 rule (n even): ∫f dx ≈ h/3 [f₀ + 4f₁ + 2f₂ + 4f₃ + … + 4f_(n−1) + f_n]
Error: O(h⁴); more accurate than trapezoidal

Simpson's 3/8 rule (n multiple of 3): ∫f dx ≈ 3h/8 [f₀ + 3f₁ + 3f₂ + 2f₃ + … + f_n]

6.3 Numerical ODEs

Euler's method: y_(n+1) = y_n + h·f(x_n, y_n); Error: O(h)

Runge–Kutta 4th order (RK4):
k₁ = h·f(xₙ, yₙ)
k₂ = h·f(xₙ + h/2, yₙ + k₁/2)
k₃ = h·f(xₙ + h/2, yₙ + k₂/2)
k₄ = h·f(xₙ + h, yₙ + k₃)
y_(n+1) = yₙ + (k₁ + 2k₂ + 2k₃ + k₄)/6; Error: O(h⁴)
📝 ESE Tip: Newton–Raphson root finding and Simpson's 1/3 rule are the most tested numerical methods. RK4 formula — especially the weighted average (1:2:2:1)/6 — must be memorised. Simpson requires an even number of intervals.
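The RK4 steps above, coded up and compared against Euler on a hypothetical test problem, y' = y with y(0) = 1 so that y(1) = e:

```python
import math

f = lambda x, y: y   # y' = y

def euler(f, x0, y0, h, steps):
    x, y = x0, y0
    for _ in range(steps):
        y += h * f(x, y)
        x += h
    return y

def rk4(f, x0, y0, h, steps):
    x, y = x0, y0
    for _ in range(steps):
        k1 = h * f(x, y)
        k2 = h * f(x + h / 2, y + k1 / 2)
        k3 = h * f(x + h / 2, y + k2 / 2)
        k4 = h * f(x + h, y + k3)
        y += (k1 + 2 * k2 + 2 * k3 + k4) / 6   # the 1:2:2:1 weighted average
        x += h
    return y

h, steps = 0.1, 10
err_euler = abs(euler(f, 0, 1, h, steps) - math.e)
err_rk4 = abs(rk4(f, 0, 1, h, steps) - math.e)
assert err_rk4 < err_euler   # O(h⁴) vs O(h) at the same step size
assert err_rk4 < 1e-5
```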
Quick Revision
Topic: Key Formula / Fact
Eigenvalues: Σλ = trace(A); Πλ = det(A)
System consistency: ρ(A) = ρ([A|b]) ↔ consistent
Euler's theorem: x·∂f/∂x + y·∂f/∂y = nf (degree n)
Laplace of e^(at): 1/(s−a)
Laplace of sin(at): a/(s²+a²)
Cauchy–Riemann: ∂u/∂x = ∂v/∂y; ∂u/∂y = −∂v/∂x
Poisson distribution: mean = variance = λ
Normal Z-score: Z = (x − μ)/σ
Newton–Raphson: x_(n+1) = x_n − f(x_n)/f'(x_n)
Simpson's 1/3: h/3[f₀ + 4f₁ + 2f₂ + … + 4f_(n−1) + f_n]
RK4 weights: (k₁ + 2k₂ + 2k₃ + k₄)/6