I stumbled across an interesting post about the state of social sharing [update: 2017, reference lost]

Instagram and Twitter were great services once. But then along came reality and blew that house in. Instagram is now as slow as a donkey ready to become a can of cat food, while Twitter is as bloated as a hippo at Christmas.

What Twitter lacks is what Instagram had: the simplicity of sharing images. What Instagram lacks is the simplicity of communication that was the foundation of Twitter.

This is a post about Pic-stagram, a weekend project to blend the best of Instagram and Twitter.

You can view the results of this mini project at the Pics app.

The basic premise of this project is to post pics directly to Twitter without any modification, provided the tweet conforms to Twitter's 140-character limit.

If the tweet exceeds that limit, it is truncated with an ellipsis and an alternative URL is provided by my self-hosted cloud app. Commenting on pics by other users is handled within the Twitter service, while the "liking" of pics is a custom function via my self-hosted cloud app.
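The truncation rule itself is trivial; here's a minimal sketch in plain C (the 140 comes from Twitter, but `build_tweet`, the buffer size, and the byte-for-character simplification are mine):

```c
#include <stdio.h>
#include <string.h>

#define TWEET_LIMIT 140

/* If `text` fits within TWEET_LIMIT, copy it through untouched; otherwise
 * truncate it, append an ellipsis, and tack on the fallback URL that points
 * at the full post on the cloud app. `out` must hold TWEET_LIMIT + 1 bytes.
 * Returns 1 if truncation occurred, 0 otherwise. Counts bytes, not
 * characters, so this sketch ignores multi-byte text. */
static int build_tweet(const char *text, const char *url, char *out)
{
    if (strlen(text) <= TWEET_LIMIT)
    {
        strcpy(out, text);
        return 0;
    }
    size_t reserved = strlen("... ") + strlen(url); /* ellipsis + URL */
    size_t keep = TWEET_LIMIT - reserved;
    memcpy(out, text, keep);
    sprintf(out + keep, "... %s", url);
    return 1;
}
```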

As you can see, it relies on two services: Twitter and my cloud app.

app (photo) -> Twitter <- comments/replies, retweets, quotes
app (metadata) -> cloud app <- likes, URL to Twitter post

Recently I have been lucky enough to assist the amazing people at one of the world's best app companies in building a folding view controller for the iOS platform. It's based on the excellent ECSlidingViewController and XYOrigami projects. It's an excellent addition to this project and provides the basis of the menu system.

As for the Twitter part of the problem, I'm reusing a Twitter singleton I wrote for another weekend project.

/////////////////////
// TwitterSingleton.h
/////////////////////
@protocol TwitterSingletonDelegate;
@interface TwitterSingleton : NSObject
@property (strong, nonatomic) ACAccountStore *accountStore;
@property (strong, nonatomic) ACAccountType *accountTypeTwitter;
@property (strong, nonatomic) ACAccount *account;
@property (strong, nonatomic) NSArray *arrayOfAccounts;
@property (strong, nonatomic) NSString *twitterUsername;
@property (assign) id<TwitterSingletonDelegate> delegate;
+ (id)accountsShared;
- (BOOL)isTwitterAvailable;
- (BOOL)isTwitterSetup;
- (void)postStatus:(NSString *)tweetText withPic:(UIImage *)image;
- (void)getUserInfo:(id)userId;
- (void)getCurrentUserInfo;
@end
@protocol TwitterSingletonDelegate
@optional
- (void)twitterStatusPosted:(id)tweet;
@required
- (void)twitterSingletonError:(int)code msg:(NSString *)error;
- (void)currentUserInfo:(id)info;
@end

Basically, all I'm interested in is ensuring that: a) I can use the iOS Twitter framework; b) I can find all the Twitter accounts on the user's iOS device; c) I can grab user information for a specific account; and d) I can post a pic.

The TwitterSingletonDelegate provides some useful information elsewhere in the app, but it's not a requirement. The only part worth noting in this class is the postStatus method, which fires our content to Twitter.
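The `accountsShared` entry point is a plain shared-instance (singleton) accessor. Stripped of Objective-C, the pattern looks like this C sketch (the `TwitterState` struct is purely illustrative; a real Objective-C version would typically guard creation with dispatch_once):

```c
#include <stdlib.h>
#include <string.h>

/* Lazily create a single shared instance on first call, and hand back the
 * same pointer ever after. Not thread-safe; the Objective-C equivalent
 * would wrap the creation in dispatch_once. */
typedef struct {
    char username[64];
} TwitterState;

static TwitterState *shared_instance(void)
{
    static TwitterState *instance = NULL;
    if (instance == NULL)
    {
        instance = calloc(1, sizeof(TwitterState));
    }
    return instance;
}
```

Every caller sees the same state, which is why the delegate and the selected account only need to be set once.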

/////////////////////
// TwitterSingleton.m
/////////////////////
- (void)postStatus:(NSString *)tweetText withPic:(UIImage *)image
{
    if (_accountStore == nil)
    {
        self.accountStore = [[ACAccountStore alloc] init];
    }
    if (_arrayOfAccounts == nil)
    {
        _accountTypeTwitter = [self.accountStore accountTypeWithAccountTypeIdentifier:ACAccountTypeIdentifierTwitter];
    }
    [self.accountStore requestAccessToAccountsWithType:self.accountTypeTwitter withCompletionHandler:^(BOOL granted, NSError *error)
    {
        if (granted)
        {
            _arrayOfAccounts = [self.accountStore accountsWithAccountType:self.accountTypeTwitter];
            if ([self.arrayOfAccounts count] > 0)
            {
                ACAccount *tempAccount = nil;
                for (ACAccount *account in self.arrayOfAccounts)
                {
                    if ([self.twitterUsername isEqual:[account username]])
                    {
                        tempAccount = account;
                    }
                }
                // Build a Twitter request
                TWRequest *postRequest = [[TWRequest alloc] initWithURL:[NSURL URLWithString:@"https://upload.twitter.com/1/statuses/update_with_media.json"]
                                                             parameters:nil
                                                          requestMethod:TWRequestMethodPOST];
                [postRequest addMultiPartData:[tweetText dataUsingEncoding:NSUTF8StringEncoding]
                                     withName:@"status"
                                         type:@"multipart/form-data"];
                [postRequest addMultiPartData:UIImageJPEGRepresentation(image, 0.9)
                                     withName:@"media"
                                         type:@"image/jpeg"];
                // Post the request
                postRequest.account = tempAccount;
                [postRequest performRequestWithHandler:^(NSData *responseData, NSHTTPURLResponse *urlResponse, NSError *error)
                {
                    if (responseData)
                    {
                        if (200 == [urlResponse statusCode])
                        {
                            NSError *jsonError;
                            id timeline = [NSJSONSerialization JSONObjectWithData:responseData
                                                                          options:NSJSONReadingMutableLeaves
                                                                            error:&jsonError];
                            if (timeline)
                            {
                                [delegate twitterStatusPosted:timeline];
                            }
                            else
                            {
                                // Inspect the contents of jsonError
                                [delegate twitterSingletonError:TS_JSON_EMPTY msg:@"Failed to access data"];
                            }
                        }
                        else if (403 == [urlResponse statusCode])
                        {
                            // NOTE
                            // https://dev.twitter.com/discussions/3891
                            [delegate twitterSingletonError:TS_RATE_LIMIT_EXCEEDED msg:@"Rate limit was exceeded"];
                        }
                    }
                    if (error)
                    {
                        // TODO
                        // Handle errors
                    }
                }];
            }
        }
    }];
}

Sending a pic is done via a multipart/form-data POST, which is surprisingly easy in iOS 5.
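TWRequest assembles the multipart body for you, but it's worth seeing what's actually on the wire. Here's a rough C sketch of one text part; the boundary string and field name below are illustrative, not what TWRequest literally emits:

```c
#include <stdio.h>

/* Append one text part to a multipart/form-data body: boundary line,
 * Content-Disposition header with the field name, blank line, then the
 * value. Returns the number of bytes written (snprintf semantics). */
static int append_text_part(char *buf, size_t cap, const char *boundary,
                            const char *name, const char *value)
{
    return snprintf(buf, cap,
        "--%s\r\n"
        "Content-Disposition: form-data; name=\"%s\"\r\n"
        "\r\n"
        "%s\r\n",
        boundary, name, value);
}
```

A binary part such as the JPEG works the same way, with a Content-Type header and raw bytes instead of text, and the body ends with a closing `--boundary--` line.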

With the transport to Twitter sorted, the next piece of the puzzle is taking a snapshot. After exploring a few options, which usually means writing stuff from scratch, I’ve settled on using Brad Larson’s GPUImage project. It’s simple to use and offers a bit of future proofing with its ability to process video as well as static photos.

/////////////////////////
// CameraViewController.h
/////////////////////////
#import "TweetViewController.h"

typedef enum {
    GPUIMAGE_SEPIA,
    GPUIMAGE_GRAYSCALE,
    GPUIMAGE_PIXELLATE,
    GPUIMAGE_VIGNETTE,
    GPUIMAGE_RGB,
    GPUIMAGE_NUMFILTERS
} GPUImageFilterType;

@interface CameraViewController : UIViewController
{
    GPUImageStillCamera *stillCamera;
    GPUImageOutput *filter;
    GPUImageOutput *mainFilter;
    GPUImagePicture *sourcePicture;
    GPUImageView *filterView;
    GPUImageFilterType filterType;
    UIButton *cameraButton;
}
@property (nonatomic, retain) TweetViewController *tweet;
@end

GPUImage offers a heck of a lot of pre-built filters. I’ve opted to limit this project to just 5, because there really isn’t any point in spending a great deal of time playing around with filters for such a simple project. If you really want to expand your horizons with the GPUImage framework then check out Perlin Noise on GPU in GPUImage for a heads up on what’s achievable.
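For a feel of what a filter actually does per pixel, here's the classic sepia matrix in C. These are the widely used sepia coefficients; GPUImageSepiaFilter's exact shader constants may differ slightly:

```c
/* Apply the classic sepia tone matrix to one RGB pixel (channels 0-255).
 * Each output channel is a weighted mix of the inputs, clamped at 255. */
static void sepia_pixel(int r, int g, int b, int *out_r, int *out_g, int *out_b)
{
    int nr = (int)(0.393 * r + 0.769 * g + 0.189 * b);
    int ng = (int)(0.349 * r + 0.686 * g + 0.168 * b);
    int nb = (int)(0.272 * r + 0.534 * g + 0.131 * b);
    *out_r = nr > 255 ? 255 : nr;
    *out_g = ng > 255 ? 255 : ng;
    *out_b = nb > 255 ? 255 : nb;
}
```

The point of GPUImage is that this arithmetic runs per pixel on the GPU, so applying it to a live camera feed is cheap.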

/////////////////////////
// CameraViewController.m
/////////////////////////
- (void)setupCamera
{
    if (backCamera)
    {
        stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                          cameraPosition:AVCaptureDevicePositionBack];
    }
    else
    {
        stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                                          cameraPosition:AVCaptureDevicePositionFront];
    }
    stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    filter = [[GPUImageGammaFilter alloc] init];
    [stillCamera addTarget:filter];
    filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:filterView];
    [filter addTarget:filterView];
    [stillCamera startCameraCapture];
}

As you can see, setting up the still camera with GPUImage is ridiculously easy, but if you want the ability to change filters while keeping a live view of your subject, things get a tad complicated. Lucky for you, I'm in a rambling mood.

/////////////////////////
// CameraViewController.m
/////////////////////////
- (void)updateCameraFilters
{
    [stillCamera removeAllTargets];
    [stillCamera pauseCameraCapture];
    [filter removeAllTargets];
    [mainFilter removeAllTargets];
    // https://github.com/BradLarson/GPUImage/issues/112
    filter = [[GPUImageFilterGroup alloc] init];
    GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
    GPUImageContrastFilter *contrastFilter = [[GPUImageContrastFilter alloc] init];
    GPUImageBrightnessFilter *brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    switch (selectedFilter) {
        case GPUIMAGE_SEPIA:
            mainFilter = [[GPUImageSepiaFilter alloc] init];
            [(GPUImageSepiaFilter *)mainFilter setIntensity:1.0];
            break;
        case GPUIMAGE_GRAYSCALE:
            mainFilter = [[GPUImageGrayscaleFilter alloc] init];
            break;
        case GPUIMAGE_PIXELLATE:
            mainFilter = [[GPUImagePixellateFilter alloc] init];
            [(GPUImagePixellateFilter *)mainFilter setFractionalWidthOfAPixel:0.02];
            break;
        default:
            mainFilter = [[GPUImageRGBFilter alloc] init];
            break;
    }
    [(GPUImageFilterGroup *)filter addTarget:mainFilter];
    [(GPUImageFilterGroup *)filter addTarget:saturationFilter];
    [(GPUImageFilterGroup *)filter addTarget:contrastFilter];
    [(GPUImageFilterGroup *)filter addTarget:brightnessFilter];
    if (filterSelected)
    {
        [contrastFilter setContrast:initialContrast];
        [saturationFilter setSaturation:initialSaturation];
        [brightnessFilter setBrightness:initialBrightness];
        [self.contrastFilterSettingsSlider setValue:initialContrast];
        [self.saturationFilterSettingsSlider setValue:initialSaturation];
        [self.brightnessFilterSettingsSlider setValue:initialBrightness];
        filterSelected = NO;
        normalFilter = NO;
    }
    else
    {
        [contrastFilter setContrast:contrast];
        [saturationFilter setSaturation:saturation];
        [brightnessFilter setBrightness:brightness];
    }
    // https://github.com/BradLarson/GPUImage/issues/130
    [mainFilter prepareForImageCapture];
    [saturationFilter prepareForImageCapture];
    [contrastFilter prepareForImageCapture];
    [brightnessFilter prepareForImageCapture];
    [mainFilter addTarget:saturationFilter];
    [saturationFilter addTarget:contrastFilter];
    [contrastFilter addTarget:brightnessFilter];
    [(GPUImageFilterGroup *)filter setInitialFilters:[NSArray arrayWithObjects:mainFilter, nil]];
    [(GPUImageFilterGroup *)filter setTerminalFilter:brightnessFilter];
    [stillCamera addTarget:filter];
    [filter addTarget:filterView];
    // https://github.com/BradLarson/GPUImage/issues/99
    [filter prepareForImageCapture];
    [stillCamera resumeCameraCapture];
}

As you can see, it’s a bit of a mission. In my infinite wisdom I added a few additional filters to allow a bit of fine tuning of the live view before capture. In hindsight, it wasn’t the brightest idea but it was an opportunity to learn more about GPUImage. So in the header we add the following and @synthesize them in the class implementation file.

/////////////////////////
// CameraViewController.h
/////////////////////////
@property(readwrite, unsafe_unretained, nonatomic) IBOutlet UISlider *contrastFilterSettingsSlider;
@property(readwrite, unsafe_unretained, nonatomic) IBOutlet UISlider *saturationFilterSettingsSlider;
@property(readwrite, unsafe_unretained, nonatomic) IBOutlet UISlider *brightnessFilterSettingsSlider;

Capturing the UISlider values is done via IBActions.

/////////////////////////
// CameraViewController.m
/////////////////////////
- (IBAction)updateFilterFromContrastSlider:(id)sender
{
    contrast = [(UISlider *)sender value];
    [self updateCameraFilters];
}

- (IBAction)updateFilterFromSaturationSlider:(id)sender
{
    saturation = [(UISlider *)sender value];
    [self updateCameraFilters];
}

- (IBAction)updateFilterFromBrightnessSlider:(id)sender
{
    brightness = [(UISlider *)sender value];
    [self updateCameraFilters];
}

The only thing left to do now is capture and save the image from the camera.

/////////////////////////
// CameraViewController.m
/////////////////////////
- (void)captureImage
{
    [cameraButton setEnabled:NO];
    [stillCamera capturePhotoAsPNGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedPNG, NSError *captureError)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeImageDataToSavedPhotosAlbum:processedPNG metadata:nil completionBlock:^(NSURL *assetURL, NSError *albumError)
        {
            // Ensure we are responsive
            runOnMainQueueWithoutDeadlocking(^{
                self.tweet = [[TweetViewController alloc] initWithFileURL:assetURL];
                [[self navigationController] pushViewController:self.tweet animated:YES];
            });
        }];
    }];
}

If you wanted to crop the image as a square and add a picture frame, like Instagram, then all you need to do is change the captureImage method above to this:

/////////////////////////
// CameraViewController.m
/////////////////////////
- (void)captureImage
{
    [cameraButton setEnabled:NO];
    [stillCamera capturePhotoAsPNGProcessedUpToFilter:filter withCompletionHandler:^(NSData *processedPNG, NSError *captureError)
    {
        // Add a picture frame and crop
        UIImage *image = nil;
        UIImage *processedImage = [UIImage imageWithData:processedPNG];
        UIImage *imageMask = [UIImage imageNamed:@"filter-border-black.png"];
        image = [UIImage layerImage:[processedImage imageByScalingAndCroppingForSize:imageMask.size] under:imageMask];
        NSData *imageData = UIImagePNGRepresentation(image);
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeImageDataToSavedPhotosAlbum:imageData metadata:nil completionBlock:^(NSURL *assetURL, NSError *albumError)
        {
            // Ensure we are responsive
            runOnMainQueueWithoutDeadlocking(^{
                self.tweet = [[TweetViewController alloc] initWithFileURL:assetURL];
                [[self navigationController] pushViewController:self.tweet animated:YES];
            });
        }];
    }];
}
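The square crop performed by the imageByScalingAndCroppingForSize: helper used above boils down to centering a square on the shorter side of the image; the arithmetic is just:

```c
/* Compute a centered square crop rectangle for a w x h image: the square's
 * side is the shorter dimension, and the origin centers it on the longer
 * one. */
static void center_square_crop(int w, int h, int *x, int *y, int *side)
{
    *side = w < h ? w : h;
    *x = (w - *side) / 2;
    *y = (h - *side) / 2;
}
```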

Now that the image is captured and processed the way I like it, we jump over to constructing a message to send to Twitter.

Most people on planet Earth using the Internet will be aware that Twitter has a character limit based on SMS text messages. The problem is that Instagram doesn't impose such a limitation. I decided very early in the design stage to allow the TweetViewController class to cater for both Twitter and people who like to ramble.

The first thing we do is instantiate our Twitter singleton class and get the currently selected account's user information. Then we grab the captured image we stored previously.

/////////////////////////
// TweetViewController.m
/////////////////////////
- (void)viewDidLoad
{
    [super viewDidLoad];
    accounts = [[TwitterSingleton alloc] init];
    accounts.delegate = self;
    [accounts getCurrentUserInfo];
    [self loadImageFromStore];
}

- (void)loadImageFromStore
{
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        ALAssetRepresentation *rep = [myasset defaultRepresentation];
        CGImageRef iref = [rep fullResolutionImage];
        if (iref)
        {
            UIImage *largeimage = [UIImage imageWithCGImage:iref];
            [self hiresImageAvailable:largeimage];
        }
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        // TODO
        // Inform the user that the image was unable to be attached to the tweet
        // Pop view to root
    };
    ALAssetsLibrary *assetslibrary = [[ALAssetsLibrary alloc] init];
    [assetslibrary assetForURL:self.imagePath
                   resultBlock:resultblock
                  failureBlock:failureblock];
}

Because grabbing the image from the Assets Library can be quite time-consuming, we're using blocks here.

[update 2017: I got bored writing this, but as you can see at https://twitter.com/picsapp it worked well]
