show that 0 < f(x) -x < c for every 0 < x < 1
October 5, 2010
Tags: algebra, unsolved
Problem Statement
Every $x$, $0 \le x \le 1$, admits a unique representation
$$x = \sum_{j=0}^{\infty} a_j 2^{-j},$$
where all the $a_j$ belong to $\{0,1\}$ and infinitely many of them are $0$. If
$$b(0) = \frac{1+c}{2+c}, \qquad b(1) = \frac{1}{2+c}, \qquad c > 0,$$
and
$$f(x) = a_0 + \sum_{j=0}^{\infty} b(a_0) \cdots b(a_j)\, a_{j+1},$$
show that $0 < f(x) - x < c$ for every $x$, $0 < x < 1$.
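A quick numerical sanity check of the claimed inequality, reading the garbled fractions as $b(0) = (1+c)/(2+c)$ and $b(1) = 1/(2+c)$ (that reading is an assumption, but it is the only one that makes the inequality hold on test values). For dyadic rationals the binary expansion terminates, so a finite partial sum of the series for $f$ is exact:

```python
from fractions import Fraction

def digits(x, n):
    """Binary digits a_0, a_1, ..., a_n of x in [0, 1], using the
    terminating representation (so infinitely many a_j are 0)."""
    a = []
    for _ in range(n + 1):
        bit = 1 if x >= 1 else 0
        a.append(bit)
        x = (x - bit) * 2
    return a

def f(x, c, n=64):
    """Partial sum of f(x) = a_0 + sum_{j>=0} b(a_0)...b(a_j) a_{j+1},
    assuming b(0) = (1+c)/(2+c) and b(1) = 1/(2+c).
    Exact for dyadic rational x, since the expansion terminates."""
    b = {0: (1 + c) / (2 + c), 1: 1 / (2 + c)}
    a = digits(x, n + 1)
    total = Fraction(a[0])
    prod = Fraction(1)
    for j in range(n + 1):
        prod *= b[a[j]]
        total += prod * a[j + 1]
    return total

# Example: x = 1/2 gives f(x) = b(0) = (1+c)/(2+c),
# so f(x) - x = c / (2(2+c)), which lies strictly between 0 and c.
```

Running this over a grid of dyadic $x$ and several $c > 0$ confirms $0 < f(x) - x < c$ in every case, which supports the reconstruction of the formulas above.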