Web3D using software rendering
![Image](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrBdalG1Vtvmr_BKUTvB13vN6XIaKgVF92S_hUZzk-olXOGN9IWzAILXLkPQBP7qfcSiUpvYY6-MKAyxVXphaOhqyEG1LbPmTCBQyFQD66QoSaUt5TAxN4VVhcIASuDUzU2Vo7i0crGRl9/w400-h386/software_earth.jpg)
Nowadays, most people implementing a Web3D demo would take the WebGL approach. WebGL has effectively become the standard for Web3D: Chrome, Firefox, Safari, IE, and Chrome for Android all support it, and iOS 8, launching this year, will bring WebGL to mobile Safari as well. To support Web3D on iOS 7 and earlier, I surveyed the software renderer of three.js and helped add texture mapping and per-pixel lighting to it.

First, we get the image data from the canvas:

```js
var context = canvas.getContext( '2d', { alpha: parameters.alpha === true } );
imagedata = context.getImageData( 0, 0, canvasWidth, canvasHeight );
data = imagedata.data;
```

Second, we project the faces of the objects into screen space. We don't need to sort them with the painter's algorithm, because three.js implements a screen-sized z-buffer that stores a depth value per pixel for depth testing. Then we start to interpolate the pixels of these faces:

```js
var dz12 = z1 - z2;
```
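To illustrate the z-buffer idea, here is a minimal, self-contained sketch (not the actual three.js SoftwareRenderer code; names like `plotPixel` and `zbuffer` are my own for this example). Each pixel keeps its nearest depth seen so far; a new fragment is written only if it is closer than what the buffer holds.

```js
var canvasWidth = 4, canvasHeight = 4;

// One depth value per pixel, initialised to "infinitely far away".
var zbuffer = new Float32Array( canvasWidth * canvasHeight ).fill( Infinity );

// Fake RGBA pixel array, analogous to imagedata.data from getImageData().
var data = new Uint8ClampedArray( canvasWidth * canvasHeight * 4 );

function plotPixel( x, y, z, r, g, b ) {
    var offset = x + y * canvasWidth;
    // Depth test: a nearer pixel is already stored, so reject this one.
    if ( z >= zbuffer[ offset ] ) return;
    zbuffer[ offset ] = z; // record the new nearest depth
    var i = offset * 4;
    data[ i ] = r; data[ i + 1 ] = g; data[ i + 2 ] = b; data[ i + 3 ] = 255;
}

// A far (blue) pixel drawn first, then a near (red) pixel at the same spot:
plotPixel( 1, 1, 0.9, 0, 0, 255 );
plotPixel( 1, 1, 0.1, 255, 0, 0 );
// A far (green) pixel drawn afterwards is rejected by the depth test:
plotPixel( 1, 1, 0.5, 0, 255, 0 );

console.log( data[ ( 1 + 1 * canvasWidth ) * 4 ] ); // 255 — red survived
```

Because every pixel carries its own depth, faces can be rasterized in any order and still resolve correctly, which is exactly why the painter's-algorithm sort can be skipped.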