Realtime Video on the iPhone from a Local Server

The problem: how do you stream video from a local machine to an iPhone OS device in real time? Real time is absolutely necessary here, which basically rules out the HTTP Streaming Server (as described previously): it can stream to multiple clients, but has to sacrifice real time for that.

I looked a bit into sending the video packets via UDP, but that would mean having to know the IP address of the device, not to mention the overhead of re-assembling the packets into video frames and keeping them somewhat in sync. And although the maximum possible size of a UDP datagram is about 65K, the wireless router would need to be configured to handle jumbo-sized packets as well. Hmmm. There must be a simpler way…

Since this is only on a local network, and only intended for one recipient, I wondered whether the network speed would be high enough to deliver 25fps via a plain Apache server. I made a simple Max patch that captures the video and writes a jpeg image every 40ms, which should give us 25fps. Of course this can be done with any other software as well. If you think that might be too much strain on the hard disk, have a look at how to set up a RAM disk at an arbitrary mountpoint, for example in your local webserver directory.

So we have our webserver and our live-updating image: 192.168.20.117/live.jpg. Viewing it in a browser and refreshing should give you updated images. So far so good. (Of course, that is my local address; yours might vary.)

Now, basically all we have to do is make a simple iPhone app that reloads the image periodically and displays it:

– create an NSURLConnection and point it at the image address
– collect the data, and display the image once the download is completed
– download the image again after 40ms

What can I say, I am surprised how well it works. There is some trouble, though: about 1% of the returned images contain no data, which results in a flicker on the iPhone… A simple test whether the received [data length] > 0 takes care of that problem. I am guessing that this is caused by a collision between the write time of the Max patch and the download time of the iPhone application, but I am not 100% sure. If anyone knows the answer, let me know. And, yeah, I should really make a video demo-ing this… Anyway, here are the juicy bits of the code:
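(The snippet refers to a few instance variables declared in the header. Here is a minimal sketch of the controller interface I am assuming; the ivar and method names are taken from the code below, the class name and exact types are my guesses:)

#import <UIKit/UIKit.h>
#import <sys/time.h>

// hypothetical interface; only the ivar and method names
// come from the snippet below
@interface LiveImageViewController : UIViewController {
	UIImageView *reloadImageView;	// shows the latest frame
	NSMutableData *data;			// accumulates the current download
	time_t diff;					// reference second for currentTime
	float startTimestamp;			// when the current download started
	int imagesDownloaded;			// simple statistics
	int downloadErrors;
	int zeroBytes;
	long totalBytes;
}
- (void)downloadImage;
- (float)currentTime;
@end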
- (void)loadView {
	[super loadView];
	NSLog(@"start");
		
	// landscape-sized view, rotated 90° to fill the portrait window
	reloadImageView = [[UIImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, 480, 320)];
	reloadImageView.center = CGPointMake(160.0, 240.0);
	reloadImageView.transform = CGAffineTransformMakeRotation(M_PI/2);

	[self.view addSubview:reloadImageView];

	self.view.backgroundColor = [UIColor blueColor];
	
	[reloadImageView release];	// the view hierarchy retains it from here on
	
	// remember the second at startup; currentTime subtracts it so the
	// timestamp fits into a float without losing precision
	struct timeval tv;
	gettimeofday(&tv, NULL);
	diff = tv.tv_sec;
	
	[self downloadImage];
}

- (void)downloadImage {
	startTimestamp = [self currentTime];

	NSString *address = @"http://192.168.20.117/live.jpg";

	NSURLRequest *request = [NSURLRequest requestWithURL:[NSURL URLWithString:address]];
	[NSURLConnection connectionWithRequest:request delegate:self];
}
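// Note: by default, NSURLRequest may hand back a cached copy of live.jpg.
// If stale frames ever show up, creating the request with an explicit
// cache policy should force a fresh download every time:
//
//	NSURLRequest *request =
//		[NSURLRequest requestWithURL:[NSURL URLWithString:address]
//						 cachePolicy:NSURLRequestReloadIgnoringCacheData
//					 timeoutInterval:1.0];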


- (void)connection:(NSURLConnection *)connection didFailWithError:(NSError *)error {
	downloadErrors++;
	[self downloadImage];	// keep the loop running despite the error
}

- (void)connection:(NSURLConnection *)connection didReceiveResponse:(NSURLResponse *)response {
	//NSLog(@"didReceiveResponse");
	[data release];												// release the previous buffer (releasing nil is fine)
	data = [[NSMutableData alloc] initWithCapacity:20000];		// create & retain a fresh one
	
}

- (void)connection:(NSURLConnection *)connection didReceiveData:(NSData *)partialData {
	//NSLog(@"didReceiveData");
	[data appendData: partialData];
}

- (void)connectionDidFinishLoading:(NSURLConnection *)connection {
	//NSLog(@"connectionDidFinishLoading");
	
	// how far the download overshot the 40ms frame budget (negative = early)
	float difference = (([self currentTime] - startTimestamp) - 0.040);
	if ([data length] > 0) {
		// check that data has content (an image created from empty
		// data shows nothing, but flashes the background)
		reloadImageView.image = [UIImage imageWithData:data];
		imagesDownloaded++;
		totalBytes += [data length];
	} else {
		zeroBytes++;
	}

	if (difference < 0) {
		// finished early: wait out the rest of the 40ms slot
		[self performSelector:@selector(downloadImage) withObject:nil afterDelay:fabs(difference)];
	} else {
		[self downloadImage]; // behind schedule, loop immediately
	}

}

- (float)currentTime {
	struct timeval tv;
	gettimeofday(&tv, NULL);
	float frac = tv.tv_usec / 1000000.0;	// microseconds as fractional seconds
	// subtracting the reference second (diff) keeps the value small enough
	// for a float to retain millisecond precision
	float current = (tv.tv_sec - diff) + frac;
	return current;
}
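A quick note on the design: everything runs on the main run loop. NSURLConnection delivers its delegate callbacks asynchronously, and performSelector:afterDelay: schedules the next download, so no extra threads or timers are needed. And instead of blindly firing every 40ms, the loop measures how long each download actually took and only waits out the remainder, so slow frames don't pile up behind a fixed timer.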

And here is the whole demo Xcode project. Make sure to change the image address to your local one. I also included a simple Max patch that saves images from a camera to disk.

Update: Although the initial diff timer was in the snippet here, it did not make it into the original Xcode project. Mistake corrected; the Xcode project should now reflect the code here.