Let f(x) be defined on the interval I=(-1,1), and suppose that f is continuous at some point c∈I with f(c)≠0.
Use an ε-δ argument to show that there is an interval (c-δ,c+δ)⊂I about c such that f(x)f(c)>0 for every x∈(c-δ,c+δ).
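A sketch of the standard argument, with the usual choice ε = |f(c)|/2 (this particular choice of ε is an assumption, not stated in the problem):

```latex
% Sketch: sign persistence of a continuous function near a point where it is nonzero.
Take $\varepsilon = \tfrac{|f(c)|}{2} > 0$. Since $f$ is continuous at $c$,
there is a $\delta > 0$, chosen small enough that $(c-\delta, c+\delta) \subset I$, with
\[
  |x - c| < \delta \implies |f(x) - f(c)| < \tfrac{|f(c)|}{2}.
\]
For every such $x$,
\[
  f(x)f(c) = f(c)^2 + f(c)\bigl(f(x) - f(c)\bigr)
           \ge f(c)^2 - |f(c)|\,|f(x) - f(c)|
           > f(c)^2 - \tfrac{|f(c)|^2}{2}
           = \tfrac{f(c)^2}{2} > 0.
\]
```

In words: once f(x) stays within |f(c)|/2 of f(c), it cannot cross zero, so it keeps the sign of f(c) throughout (c-δ,c+δ).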