How to auto publish post on Facebook Fan Page as admin using Facebook PHP SDK V4 Graph API v2.2 (2) - Get Permanent Page Access Token

Graph API Explorer

There are various types of access tokens for a Facebook app. The best article that explains them is How to handle access tokens with Facebook SDK v4.0 by Sammy Kaye. To post an article / item / product from our own website to a Facebook fan page, we need to use a Page Access Token.

The initial token can be obtained from the Facebook Graph API Explorer tool. However, that is just a short-lived access token, which expires an hour or two after generation. We need to make a Graph API call to exchange it for a long-lived access token, which expires after about two months. Lastly, we get a permanent page access token, which never expires and can be used in our daily script.

We could test our script with a short-lived access token, but then we would have to regenerate a new token after every expiration. So we should get the permanent access token first, before writing our script.
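As a preview of that flow, here is a rough sketch using plain cURL calls against the Graph API (not the article's final code); the app ID, app secret, short-lived token and page ID are placeholders you need to fill in.

<?php
// Sketch only: exchange a short-lived user token for a long-lived one, then read the
// page's own token, which does not expire. All four values below are placeholders.
$appId           = 'your-app-id';
$appSecret       = 'your-app-secret';
$shortLivedToken = 'token-from-graph-api-explorer';
$pageId          = 'your-page-id';

function graphGet($path, array $params) {
    $ch = curl_init('https://graph.facebook.com/v2.2/' . $path . '?' . http_build_query($params));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $body = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($body, true);
    if ($data === null) {      // v2.2 returns oauth/access_token as a query string, not JSON
        parse_str($body, $data);
    }
    return $data;
}

// Step 1: short-lived token -> long-lived user token (roughly two months).
$longLived = graphGet('oauth/access_token', array(
    'grant_type'        => 'fb_exchange_token',
    'client_id'         => $appId,
    'client_secret'     => $appSecret,
    'fb_exchange_token' => $shortLivedToken,
));

// Step 2: with the long-lived user token, ask the page for its own access token.
// The page token returned this way should never expire.
$page = graphGet($pageId, array(
    'fields'       => 'access_token',
    'access_token' => $longLived['access_token'],
));

echo $page['access_token'];    // store this permanent page token for the daily script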

Read more...

How to auto publish post on Facebook Fan Page as admin using Facebook PHP SDK V4 Graph API v2.2 (1) - Create Facebook App

Facebook Developers

I have an ecommerce webstore using OpenCart with around 2000 products. I have created Twitter, Google+, Pinterest and Facebook pages for this website. With Google moving towards semantic search, all these social signals are increasingly important. Every day, I have to select one or two products and post them on these social media websites. I don't like doing this manually and need to find a way to let a script do the posting. For the Facebook page, I can use the APIs for daily auto posting.

When I searched for "Facebook PHP SDK for auto posting", there were lots of confusing results, mainly due to code written for different Facebook SDK versions. The documentation on the Facebook developers website is also quite difficult to understand. However, after reading articles from many websites and some trial and error with example code, I am finally able to do auto posting with Facebook PHP SDK v4 and Graph API v2.2.

Here are my expectations:

1) I want to auto post to the Facebook fan page as its admin.

2) No Login with Facebook required.

3) I want to post the product title, product URL, and product image, just like a normal post.

4) With more than 2000 product listings and still increasing, I want to be able to auto post three to five items per day. It will take more than one year to post all the products from my website.

In part 1 of this article, I will describe how I set up the Facebook app. In the next article, I will describe the code used to auto post a single item to the Facebook page.
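As a rough preview of what that code will do, here is a minimal sketch with the PHP SDK v4, not the final script; the app credentials, page ID, page access token and product URLs are all placeholders.

<?php
// Minimal sketch of a single product post with the Facebook PHP SDK v4.
// Assumes the SDK is installed via Composer; all credential values are placeholders.
require_once __DIR__ . '/vendor/autoload.php';

use Facebook\FacebookSession;
use Facebook\FacebookRequest;

FacebookSession::setDefaultApplication('your-app-id', 'your-app-secret');

// Using the page access token means the post appears as the page itself (i.e. as admin).
$session = new FacebookSession('your-permanent-page-access-token');

$response = (new FacebookRequest($session, 'POST', '/your-page-id/feed', array(
    'message' => 'Product title goes here',
    'link'    => 'http://www.example.com/product/123',
    'picture' => 'http://www.example.com/image/product-123.jpg',
)))->execute()->getGraphObject();

echo 'Posted, id = ' . $response->getProperty('id');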

Read more...

Price Comparison Engine using SphinxSearch

Sphinx Search Server

My last article was written in February this year. I have been extremely busy with my internet retail business over the last eight months, as well as putting a lot of effort into releasing a price comparison engine for Malaysia's ecommerce market, BijakMall.com. Bijak Mall collects product information from various internet malls in Malaysia and indexes it into a database. Potential buyers are then able to search and compare prices through Bijak Mall's search engine. I started this project under a XAMPP environment so that I can use my laptop to test the scripts. I wrote web spiders in PHP to collect product data and store it in a MySQL database. Sphinx Search Server (Windows version) is used to index the database and respond to user queries. Sphinx is a free, open source full-text search engine designed to provide full-text search functionality to client applications.
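As a small illustration of how the pieces talk to each other (this is not Bijak Mall's actual code), Sphinx's searchd speaks the MySQL protocol (SphinxQL), so PHP can query the index through a plain mysqli connection; the index name "products" and port 9306 are just assumed defaults here.

<?php
// Sketch: query a Sphinx full-text index over SphinxQL. "products" is an assumed
// index name from sphinx.conf; 9306 is searchd's default SphinxQL port.
$sphinx = mysqli_connect('127.0.0.1', '', '', '', 9306);

$keyword = mysqli_real_escape_string($sphinx, 'samsung galaxy');
$result  = mysqli_query($sphinx, "SELECT id FROM products WHERE MATCH('$keyword') LIMIT 20");

while ($row = mysqli_fetch_assoc($result)) {
    // use $row['id'] to pull the full product details (price, shop, url) from MySQL
    echo $row['id'] . PHP_EOL;
}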

Read more...

Facebook Remote Status Update with PHP/cURL Bot

Facebook login form

** The script in this post is no longer working properly. I have updated the script and posted it at New and Updated! Facebook Remote Status Update with PHP/cURL Bot **

 

In previous posts, the discussion mainly focused on getting a web page's source file and scraping text out of it. In this post, I will show you how I use PHP/cURL to log into a Facebook account and post a status update to the Facebook wall. Once we know how to do remote posting with PHP/cURL, we can do many things, such as auto posting comments to a forum or blog, filling in a contact form and sending email to our target, or logging in to a website and pulling out the information we need.

For this test case, I am using XAMPP on the PC and logging in to the Facebook mobile interface, which is much simpler than its desktop version. You can compare the Facebook login page source for both the desktop and mobile versions.

From the browser, go to http://m.facebook.com and there is only one login form.
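A stripped-down version of that login step, submitting the form with cURL and keeping the session cookies, could look like the sketch below. As noted at the top, this no longer works against Facebook itself; the field names and URL are what the mobile site used at the time, so treat them as illustrative only.

<?php
// Illustrative only: generic PHP/cURL form login with a cookie jar.
$cookieJar = __DIR__ . '/cookies.txt';

$ch = curl_init('https://m.facebook.com/login.php');
curl_setopt_array($ch, array(
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => http_build_query(array(
        'email' => 'you@example.com',       // field names used by the mobile login form at the time
        'pass'  => 'your-password',
    )),
    CURLOPT_COOKIEJAR      => $cookieJar,   // save the session cookies here...
    CURLOPT_COOKIEFILE     => $cookieJar,   // ...and send them back on later requests
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (Windows NT 6.1; rv:35.0) Gecko/20100101 Firefox/35.0',
));
$html = curl_exec($ch);
curl_close($ch);

// Later requests that reuse the same cookie jar are "logged in" and can submit
// the status update form in the same way.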

Read more...

The Art of Invisibility for Ninja Web Spider

Art of Invisibility

Most webmasters welcome webbots from search engines such as Google and Bing: the contents of their websites get indexed by the search engines, and users can easily find their websites. However, they surely do not welcome your web spider extracting data from their sites; maybe you're up to no good, like when products from etsy.com were used to drive traffic to a Hong Kong website. Most likely your IP will be blocked if webmasters detect an unknown web spider aggressively crawling their website. Way back in 2001, eBay filed legal action against Bidder's Edge, an auction scraping site, for "deep linking" into its listings and bombarding its service; Craigslist has a throttle mechanism to prevent web crawlers from overwhelming the site with requests.

Even big spiders like Google have mechanisms to prevent others from scraping their content. Try searching for a keyword and, at the results page, click page 1, then page 2, page 3, and so on. At page 20 (in my case), Google will stop displaying search results and will want to find out whether it is a human or a webbot reading the page. If you are unable to enter the correct captcha, your IP will eventually be blocked.
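The flip side for our own spiders is to crawl slowly and look like an ordinary browser. This is not from the article, just a small sketch of the idea: a browser-like User-Agent plus a random pause between requests so the spider does not hammer the target server (the URLs are placeholders).

<?php
// Sketch: fetch a list of pages politely instead of aggressively.
$urls = array('http://www.example.com/page1', 'http://www.example.com/page2');

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_USERAGENT      => 'Mozilla/5.0 (Windows NT 6.1; rv:35.0) Gecko/20100101 Firefox/35.0',
    ));
    $html = curl_exec($ch);
    curl_close($ch);

    // ... scrape $html here ...

    sleep(rand(5, 15));    // random delay so the crawl does not look aggressive
}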

Read more...

Must-have Component - Xmap - Dynamic Sitemap Generator for Joomla!

Joomla Xmap Logo

Search engines like Google and Bing want us to submit a sitemap file in their webmaster tools so that they can use it to crawl and analyze our website. If you still had a pure HTML based website during the 90's and early 00's, most likely you had to run sitemap generator software on your PC to crawl and collect URLs from your web pages and save them into a sitemap file. The major problem is that you need to generate a new sitemap file every now and then when you update your website, and then upload it to the webmaster tools again.

With Joomla, you can use Xmap, a free component created by Guillermo Vargas and available since Joomla 1.x, to regenerate a dynamic sitemap automatically, so that you can totally forget about the sitemap file after the initial setup. Xmap creates the sitemap based on the structure of the menus in your Joomla website. You can add or remove menus anytime, and Xmap will dynamically regenerate the sitemap accordingly, with additional metadata. You can also create any number of sitemaps with different options. However, Xmap also comes with poor documentation and poor support, as you can see from the JED comments. If you encounter any problems, you need to search for the solution in forums, as happened when Xmap stopped working after the recent Joomla 3.2 upgrade.

Read more...

Download and Save Images with PHP/cURL Web Scraper Script


In this article, I will discuss how to download and save image files with a PHP/cURL web scraper. I will use the email extractor script created earlier as an example. With some modification, the same script can then be used to extract product information and images from internet shopping websites such as ebay.com or amazon.com into your desired database. We can also extract business information from directory websites, both text and images, into your own website as well.

There are a few concerns and considerations before we scrape image files from websites; a short code sketch follows the list below.

1) Various file formats (JPEG, PNG, GIF, etc.) could be used across a website. Even a single web page could contain several file formats.

If we want to build a common database for all collected images (from various websites), then our PHP web scraper script needs to be able to convert them to the file format we prefer.

2) Each image could have a different file size.

Some images can be very large and some very small. Our PHP web scraping script needs to be able to resize large files to a smaller size. Resizing from large to small is not a problem; enlarging a small image will give us a poor quality result.

3) We need a naming convention for the image files.

Different websites name image files differently; some have long names, some short. Before storing the image files in our folder, we need to rename them using our own naming convention.

4) We need to add one column to the MySQL database to link the images to the related information.
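Here is the minimal sketch promised above (not the full script): fetch an image with cURL, convert it to JPEG with GD, shrink it if it is too wide, and save it under our own file name. The image URL, target width and folder are placeholders.

<?php
// Sketch of points 1-3: download, convert to JPEG, resize down, rename.
$imageUrl = 'http://www.example.com/images/some-product.png';   // placeholder
$saveDir  = __DIR__ . '/images/';
$maxWidth = 500;

$ch = curl_init($imageUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$raw = curl_exec($ch);
curl_close($ch);

$img = imagecreatefromstring($raw);           // GD detects JPEG/PNG/GIF automatically (point 1)
if ($img !== false) {
    if (imagesx($img) > $maxWidth) {
        $img = imagescale($img, $maxWidth);   // only shrink, never enlarge (point 2)
    }
    $newName = uniqid('img_') . '.jpg';       // our own naming convention (point 3)
    imagejpeg($img, $saveDir . $newName, 85); // always store as JPEG
    imagedestroy($img);
    // $newName would then go into the new MySQL column (point 4)
}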

So here we go...

Read more...