Saturday, February 21, 2009

PHP - Enable cURL in Windows.


PHP has a built-in extension for getting content from remote sites. This extension is known as cURL.

To enable cURL in a Windows environment, we need to do the following two steps:
- Uncomment extension=php_curl.dll in the PHP.INI file.
- Copy the below two DLL files from the PHP folder and put them under the windows/system32 directory.
ssleay32.dll
libeay32.dll
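
Once cURL is enabled, a request can be made as in the minimal sketch below; the URL is just a placeholder.

<?php
// Minimal cURL GET request (sketch; the URL is a placeholder).
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the response as a string
$content = curl_exec($ch);
if ($content === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);
?>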

Saturday, February 14, 2009

Excel - Removing duplicate values


We can remove duplicate entries in a list using MS-Excel.

Click Data->Filter->Advanced Filter in the Excel sheet.

Check the 'Unique records only' checkbox in the Advanced Filter dialog box to remove the duplicate entries from the list.

Thursday, February 12, 2009

JavaScript - Accessing the DOM before completion of page load


Normally we use the window.onload event to detect the completion of web page loading.
But if the page has large images, we do not need to wait until those images finish loading before accessing the DOM.
Refer to this blog for more details.

Wednesday, February 11, 2009

C# - Handling symbols in the WebBrowser control


We may face issues with the WebBrowser control in C#.NET when handling symbols such as Ø, ½, and ±, even after setting the appropriate character encoding for the control. It seems the DocumentText property does not handle these symbols properly. To overcome this issue, we can get the head and body details separately instead of using DocumentText.

The code can be written as below:
// Read the head and body markup separately instead of relying on DocumentText.
string strhead = webBrowser1.Document.GetElementsByTagName("head")[0].OuterHtml;
string strbody = webBrowser1.Document.Body.OuterHtml;
string strhtml = strhead + strbody;

Tuesday, February 10, 2009

Cross-browser web development.


An important challenge for any web developer is making a website work in many different browsers (IE, Firefox, Safari, and Chrome) and also in different browser versions.
To achieve this, we may need to write separate code for each browser when the same code behaves differently in each one, and then use the respective code based on the browser type.

In server-side code, this can be done using the USER_AGENT request header. In client-side code, it can be handled using 'navigator.appName' in JavaScript.
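
On the server side, a rough PHP sketch of user-agent based branching is shown below; the substrings checked are simplified assumptions, since real user-agent strings vary a lot.

<?php
// Rough server-side browser detection using the User-Agent header.
// The substrings below are simplified assumptions; treat this as illustrative only.
$userAgent = $_SERVER['HTTP_USER_AGENT'];

if (strpos($userAgent, 'MSIE') !== false) {
    $browser = 'Internet Explorer';
} elseif (strpos($userAgent, 'Firefox') !== false) {
    $browser = 'Firefox';
} elseif (strpos($userAgent, 'Chrome') !== false) {
    // Check Chrome before Safari, because Chrome's user agent also contains 'Safari'.
    $browser = 'Chrome';
} elseif (strpos($userAgent, 'Safari') !== false) {
    $browser = 'Safari';
} else {
    $browser = 'Unknown';
}

echo 'Detected browser: ' . $browser;
?>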


Monday, February 9, 2009

Online Bookmark


We have Bookmarks (Firefox)/Favorites (IE) in our browsers to store the URLs of frequently used websites.
But they are useful only when you use the same machine for all your work
(i.e. they are stored on your local machine only).

To overcome this problem, many online bookmarking sites (e.g. spurl.net) are available.
- They allow you to access your bookmarks/favorites from any machine connected to the Internet.
- They suggest URLs that have been bookmarked by many people.
- Most of them are free. (But they earn income by selling the browsing patterns of users to marketing/research people.)

Friday, February 6, 2009

MySQL version issue.


We may face issues if we try to use a database created by one version of MySQL server in another version of MySQL server.

In this case, we can export the structure and data as SQL queries using phpMyAdmin on the source server, and then import that SQL into the destination server.

Wednesday, February 4, 2009

MySQL - Importing a large SQL file


It is difficult to import schema and data from a large SQL file using phpMyAdmin, because the import will fail due to memory limits or timeout settings.

In this case, we can use the MySQL command-line client to process the large SQL file.

We can use it like below:
mysql> SOURCE /path/to/file.sql;

To do this, we need shell access to the MySQL server. But practically it is not always possible to have direct access to the MySQL server, because most servers are available only on remote machines.
So I am looking for a feasible solution to achieve this through a web UI itself (i.e. similar to phpMyAdmin). If anyone knows an appropriate solution, you can share it here through the comments.

You can refer to this for more details.
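
One possible direction is sketched below: a small PHP script that streams the dump statement by statement so the whole file never has to fit in memory. This is only a rough sketch under the assumption of a simple dump (each statement ends with ';' at the end of a line, and there are no multi-line string literals containing semicolons); the connection details and file path are placeholders.

<?php
// Sketch: stream a large .sql dump statement by statement from a web script.
// Connection details and the file path are placeholders.
$mysqli = new mysqli('localhost', 'user', 'password', 'database');

$handle = fopen('/path/to/file.sql', 'r');
$statement = '';
while (($line = fgets($handle)) !== false) {
    // Skip comment lines and blank lines.
    if (substr($line, 0, 2) === '--' || trim($line) === '') {
        continue;
    }
    $statement .= $line;
    // In a simple dump, a trailing ';' marks the end of one statement.
    if (substr(rtrim($line), -1) === ';') {
        $mysqli->query($statement);
        $statement = '';
    }
}
fclose($handle);
$mysqli->close();
?>

A long-running import may still hit PHP's max_execution_time, so a real web importer would also need to process the file in batches across multiple requests.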

Bookmarklet


A bookmarklet is a piece of JavaScript code that can sit in the Favorites/Bookmarks bar of your browser.

It can act on any web page you are viewing in your browser to make surfing more comfortable.

We have created a bookmarklet for our client which can extract all images from the currently displayed webpage and send those images to our server.

For more details about this bookmarklet project, you can refer to our website.

Another bookmarklet allows the user to do a search after running a customized algorithm.
You can contact us (qualitypointmail@gmail.com) for any of your software/web development needs.

You can see some sample bookmarklets here

Our Vision and Values



Our Vision
To be the Global Technical Leader by using the latest technologies effectively and by introducing innovative solutions, while keeping Employees and Users happy.
Our Values
Honesty and hard work
Delivering quality products on time and cost-effectively.

Tuesday, February 3, 2009

Google Analytics


Google Analytics is a free service provided by Google.
It presents statistics about your web pages in a nice way.

- We can see even how much time a visitor spent on a particular page.
- We can get details about the browser/net connection used to view the page.
- It shows whether the visitor is a new visitor or a repeat visitor.
- Setting up Google Analytics is very simple. You can use your existing Gmail account and get an analytics code. Once you put this code in your webpage, you can see the statistics about your page.
- This tool will be more useful in SEO marketing.


You can see more details at http://www.google.com/analytics/

Monday, February 2, 2009

Offline Crawling - Resolved the issue with the WebBrowser control in C#


It seems the WebBrowser control in C# is very useful for scraping content from websites and also for doing auto-login and auto-posting of content.

Offline scraping saves time: we crawl the website once and download all the webpages in one pass, and then we can scrape the required content from the downloaded pages any number of times without worrying about network constraints.

But we have faced many issues in using the WebBrowser control for offline scraping.
After searching the net, I came to know that many people are facing a similar issue.
Please find below the code which solves the issue.

// Load raw HTML into the WebBrowser control without navigating to a real URL.
string partHtmlpage = "Your html page";   // the offline HTML content to render
webBrowser1.AllowNavigation = true;
if (webBrowser1.Document != null)
{
    // A document already exists: clear it and open a new one.
    webBrowser1.Document.OpenNew(true);
}
else
{
    // No document yet: navigate to a blank page so a document is created.
    webBrowser1.Navigate("about:blank");
}
webBrowser1.Document.Write(partHtmlpage);

Issue with GD Library


It seems there is some issue with the GD library built into PHP.

Please refer http://bugs.php.net/bug.php?id=42522.

To resolve this issue, we have used imagefilledarc instead of imagefill in our PieChart program.
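
As a rough illustration of the workaround (not the actual PieChart code), a slice can be drawn directly as a filled arc, avoiding the flood fill altogether:

<?php
// Sketch: draw a pie slice with imagefilledarc instead of outlining the
// slice and flood-filling it with imagefill. Illustrative only.
$img = imagecreatetruecolor(200, 200);
$white = imagecolorallocate($img, 255, 255, 255);
$red = imagecolorallocate($img, 255, 0, 0);
imagefilledrectangle($img, 0, 0, 199, 199, $white); // background

// Filled 90-degree slice centered at (100, 100) inside a 160x160 bounding box.
imagefilledarc($img, 100, 100, 160, 160, 0, 90, $red, IMG_ARC_PIE);

header('Content-Type: image/png');
imagepng($img);
imagedestroy($img);
?>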
