Wednesday 14 September 2011

WHAT IS JUJU?

The emergence of cloud computing has reduced the task of server provisioning from days down to minutes. Juju is the next evolutionary step: it reduces the task of provisioning and controlling service applications in the cloud from hours down to seconds.
Before the cloud, deploying interconnected services across multiple servers took days, if not weeks. One had to procure the necessary hardware, find lab space for it, physically set it up, install the OS and required applications, and then configure and connect the various applications on each machine to provide the desired services. Once the entire solution was deployed, upgrading or replacing the service applications, modifying the connections between them, scaling out to handle higher load, or writing custom scripts for re-deployment elsewhere all required even more time.
Eventually, deployment tools like Cobbler and FAI evolved to reduce the time and complexity involved in installing the OS on each machine. Then tools like Puppet and Chef arrived to do the same for system configuration. Finally, with the arrival of cloud computing, specifically public cloud computing, the burdens of hardware acquisition and setup went away. While all of these are tremendously valuable technologies, none of them really solves the complexity of orchestrating your services, i.e. deploying, connecting, and controlling your service applications across all systems. Automatic service orchestration built for the inherent elasticity of the cloud did not exist, and manual service orchestration in the cloud takes time...and in the cloud, wasted time is wasted money.
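As a rough sketch of what that orchestration looks like in practice, the commands below show the kind of juju workflow used to deploy two services and connect them; the charm names and steps are illustrative, and the exact commands may differ slightly between juju releases.
juju bootstrap                      # launch the environment's management instance
juju deploy mysql                   # deploy a MySQL service
juju deploy wordpress               # deploy a WordPress service
juju add-relation wordpress mysql   # wire the blog to its database
juju expose wordpress               # make the service reachable from outside
juju add-unit wordpress             # scale out when load grows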

Saturday 13 August 2011

Talk in ILUGC

I had a nice experience with the Indian Linux Users Group, Chennai (ILUGC): I gave a talk at an ILUGC meet for the first time, on an enterprise asset management tool.

Description:
Calem EAM [ http://www.calemeam.com ] is an open source Enterprise Asset Management (EAM) tool that manufacturing industries can use to manage their products and assets. Enterprise asset management is a superset of CMMS (Computerized Maintenance Management System).

Thanks to the ILUGC coordinator (Mr. Shrini) for giving me this nice opportunity to deliver my talk.

Installing Java in Ubuntu

Installing Java in Ubuntu made easy :)
1. From Synaptic, select
sun-java6-jdk
and apply. This will automatically select the other required packages, such as:
sun-java6-jre
sun-java6-bin

Once sun-java6-jdk has been installed, you don't need to worry about installing anything else.

2. You have to configure the default Java to be Sun's.
From your terminal, type "sudo update-alternatives --config java".
You will get a number of alternatives, as shown below; select "2".

Selection    Path                                        Priority   Status
------------------------------------------------------------
  0          /usr/lib/jvm/java-6-openjdk/jre/bin/java    1061       auto mode
  1          /usr/lib/jvm/java-6-openjdk/jre/bin/java    1061       manual mode
* 2          /usr/lib/jvm/java-6-sun/jre/bin/java          63       manual mode

3. You are done. :)
You can now write your Java program in any editor or IDE and then compile and run it
as follows:
javac program.java (which compiles your Java program)
java program (which will run your Java program)
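
For example, here is a minimal end-to-end test; the file and class name HelloWorld are just an illustration.
cat > HelloWorld.java <<'EOF'
public class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello, Java on Ubuntu!");
    }
}
EOF
javac HelloWorld.java   # compiles the source into HelloWorld.class
java HelloWorld         # runs the class and prints the greeting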

HAPPY PROGRAMMING!
JAVA IS FUN

Thursday 11 August 2011

Enterprise Asset Management (An open source tool)

Enterprise asset management (EAM) means the whole-life, optimal management of the physical assets of an organization in order to maximize their value. It covers activities such as design, construction, commissioning, operations, maintenance and decommissioning or replacement of plant, equipment and facilities. "Enterprise" refers to the management of assets across departments, locations, facilities and, in some cases, business units. By managing assets across the enterprise, organizations can improve utilization and performance, reduce capital costs, reduce asset-related operating costs, extend asset life and subsequently improve return on assets.
The functions of asset management include fundamental life planning, life cycle costing, planned and proactive maintenance and other industry best practices. Some companies still regard physical asset management as just a more business-focused term for maintenance management, until they begin to realize its organization-wide impact and its interdependencies with operations, design, asset performance, personnel productivity and life cycle costs. This topic focuses on the progression from maintenance management to enterprise asset management.
By providing a platform for connecting people, processes, assets, industry-based knowledge and decision support capabilities based on quality information, EAM gives a holistic view of an organization's asset base, enabling managers to control and optimize their operations for quality and efficiency. EAM typically works alongside a computerized maintenance management system (CMMS): a CMMS software package contains information about an organization's maintenance operations.

Sunday 3 July 2011

Ubuntu comes to Samsung Galaxy

The Samsung Galaxy S is still on the list of the best Android phones, and it keeps getting better with all the love and care that developers, hackers and modders are showing it. Recently, XDA member Armin Coralici succeeded in installing Ubuntu on the Galaxy S.
There is a noticeable speed lag, but Ubuntu works well. The installation was done by creating a chroot environment with an ARM build of Ubuntu, which runs on top of Android rather than replacing it. The version of Ubuntu used here is a stripped-down one.
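The general recipe for such chroot installs looks roughly like the sketch below, run as root on the phone; the image name and mount point are illustrative assumptions, not the modder's exact steps.
mkdir -p /data/local/ubuntu
mount -o loop /sdcard/ubuntu-arm.img /data/local/ubuntu   # loop-mount the ARM root filesystem
mount -t proc proc /data/local/ubuntu/proc                # expose /proc inside the chroot
mount -o bind /dev /data/local/ubuntu/dev                 # and the device nodes
chroot /data/local/ubuntu /bin/bash                       # drop into the Ubuntu environment
From there, a VNC or X server is typically used to reach the Ubuntu desktop.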

Thursday 2 June 2011

BUILDING THE CLOUD WITH OPEN SOURCE



Open source software and standards are not just beneficial, but highly essential for a heterogeneous, shared and scalable environment such as the ‘cloud’. What’s more, the community has promptly readied the tools needed to meet this emerging trend. It is not surprising, then, that evangelists believe that open source has built the cloud…
We certainly do not need to tell you what free/open-source software is, but probably should spend a few minutes to clarify what cloud computing really is. Take any definition of cloud computing, and it sounds so similar to software-as-a-service, utility computing, and even grid computing, to boot. It takes quite a while to figure out the difference, which is why it is best explained to you right away, so you can appreciate the cloud computing concept and the open source advantage even better.
Hey diddle diddle…
If the cow jumped over the moon today, what would it see on the clouds below? It would see software-as-a-service (SaaS), platform-as-a-service, utility computing, managed service providers, Web services, and cloud integrators. Well, that is what it is. Cloud computing is not a new technology; it is merely a new concept that integrates many virtualisation and pay-as-you-go models that already existed.

Software-as-a-service enables a user to rent and run software applications from service providers who maintain and manage the application on their servers. Remember Salesforce.com. The user simply has to rent an app, use it over the Web, and pay as per usage.
Utility computing, similarly, enables users to rent infrastructure, such as storage or servers, and use it over the Web. Companies use such services to cater to temporary surges in requirements. Remember IBM, Sun and Amazon.com.
Web services enable developers to connect or fit functional blocks or application programming interfaces (APIs) offered over the Web into their own applications, so as to not reinvent the wheel. Remember Google Maps and Xignite.
On-demand platforms enable users to string together whatever applications they need from a service provider into a seamless solution, and use it over the Internet. This is somewhat like an extension of SaaS, but here the user picks and uses a group of applications, rather than just one. The choice of applications is, however, limited by whatever the service provider has on offer. Remember Force.com, Coghead, and Google AppEngine.
Managed services is also a similar concept; it’s just that the user is offered a service—such as network security or backup—over the Web, instead of just an application. Remember IBM, Symantec and Verizon.
Put all this together, and you have cloud computing, which is a very broad term that covers a range of resources and services offered over the Internet. Some experts opine that any resource that a company uses over the Internet, outside its firewall, is ‘on the cloud’. So, a company can choose, customise or develop an enterprise-wide solution, manage and maintain it, scale it up or down, or do whatever it wants, completely over the Internet—oh, sorry, we should now be saying “on the cloud”, but it means the same thing anyway. The cloud is nothing but a metaphor for the Internet, if that helps ease the confusion in any way!
The cat and the fiddle
The biggest advantage of cloud computing is that the end-user or developer does not have to bother about the physical location or configuration of the actual resources. The service providers will worry about all that, abstracting all the dirty details from the developers, who can work completely on a logical plane.

These systems can be easily scaled up or scaled down. You can start with a small server and a little storage during development, and then scale up the volume or features of the system on-the-go, as the usage grows. Or, a company can use cloud resources simply to cater to temporary needs. They could even opt to use a combination of their own and cloud resources. In short, they can use what they want, and pay just for what they use.
Almost all literature on this subject compares the cloud to the electricity grid. You consume electricity without bothering about where it is generated, how far away or along which route it travelled to get to you. Nor do such details as how many others are using power from that grid, bother you. Similarly, you can use software, services and infrastructure from the service providers using a Web interface, without worrying about any back-end details. Examples of cloud-computing platforms include Amazon’s Elastic Cloud Compute (EC2), Salesforce’s Force.com and Microsoft’s Azure.
The abstraction provided by cloud computing is so beneficial, from a user’s perspective, that some large organisations are cloaking even their existing infrastructure as ‘private clouds’. What this means is that they consolidate all their resources and deliver it to various user units within the company, just as cloud service providers would do to the public. So, the users within the company can use the resources they need without worrying about where they are, or how they are managed. To the company, it offers a two-fold benefit—user convenience, as well as better resource utilisation. No resource gets locked up. It is all on the private cloud, can be managed centrally, and provisioned as needed within the company. Plus, the life of resources also gets extended. An old, slow and steady computer can still be used, as long as it can connect to the network—it can run all the needed applications off the cloud!
“While the benefits of cloud computing are compelling, as with any product or technology, adoption takes its own course. Indian organisations, like many of their counterparts across the world, are in evaluation mode. They have largely moved away from the understanding mode, but adoption is not yet high. I think India presents an immense opportunity to realise the cloud possibilities. In developed countries, we see organisations looking at cloud adoption in stages: first, adopting infrastructure virtualisation; then, setting up a private cloud; this is followed by selective use of the public cloud; before moving a large part of the application portfolio to the cloud (private or public). For Indian organisations, there is a big opportunity for them to leapfrog to the cloud faster, given their under-investment in information technology (IT). So they don’t have to worry about managing legacy investments, creating an expensive migration plan, and then adopting the cloud,” says Dhiraj Sinha, leader of the Applications Technology group, Dell Services. Many of Dell’s solutions have used open source—for example, Dell is working with Canonical to help customers adopt Ubuntu-powered open source clouds, and the high-performance analytics services of Dell use Hadoop, the open source distributed computing and data-storage framework.
“Open source has built the cloud”
Sharing, the freedom to mix and match, choice, and many other characteristics of cloud computing demonstrate a likeness to the principles of open source software. More than just the similarities, there is also a sheer necessity for open source software and open standards in clouds that comprise heterogeneous, and often proprietary, infrastructure. The availability of source code, the freedom to modify and redistribute, the flexibility and constant evolution, and other open philosophies greatly favour the cloud schema of things. A cloud is, after all, a fluffy and lovable structure. Would we not hate a boxed cloud?

Undoubtedly, open source lies at the foundation of many of the earliest cloud implementations. “Open source has built the cloud. When we think of the services we consume on the cloud, from Facebook and Google to Amazon, none would have been affordable or scalable using a traditional licensing model,” says Prakash Advani, partner manager—Central Asia, Ubuntu. “Moving that capability from the leading edge of SaaS provision and into the mainstream enterprise is the next big opportunity for open source.”
The fact that most of the public clouds, including Amazon’s Elastic Compute Cloud (EC2), run on Linux-based platforms is just one facet of the story. In fact, that is now taken so much for granted that the focus has shifted to tools and platforms that enable the building of private or hybrid clouds, the integration of legacy infrastructure with the cloud, and so on.
A rather large number of such open source tools are now available, right from platforms and development tools, to management dashboards and automated migration tools for applications. Existing open source platforms are also fast adapting to the needs of cloud computing and include features such as intelligent workload management and cloud-enabled scalability considerations, to help massive horizontal scalability at all the layers of the technology stack. Ubuntu, Red Hat, and almost every other open source platform now has a stable cloud offering.
George Paul, executive vice president, HCL Infosystems, quickly justifies our point with some examples: “An open source software-infrastructure project called Eucalyptus imitates the experience of using Amazon’s EC2, but allows users to run programs on their own resources. The University of Chicago’s Nimbus is another open source cloud-computing project that is widely recognised as having pioneered the field. Today, customers have a large choice of open source applications for the cloud, including Red Hat, Traffic Server, Puppet, Zoho, Cloudera, Enomaly and Joyent.”
He goes on to explain that apart from new tools, existing open source offerings are being made cloud-enabled through standardisation. “A common standard called the Application Packaging Standard (APS), an open standard with all specifications, has been introduced; it helps in making applications multi-tenant. It consists of over 250 applications, and will continue to grow in the future as well,” he says. APS helps to standardise the packaging, automate the provisioning and management, and to integrate with other hosted services.
HCL’s O’zone, a cloud-enabled services suite, combines a variety of open source solutions, including the Proxmox open source virtualisation platform, Red Hat Enterprise Virtualisation, and open source enterprise resource planning, customer relationship management and content management systems.
Open source for the cloud
Virtualisation of infrastructure is at the heart of cloud computing. Open source options, such as the Kernel-based Virtual Machine (KVM) for Linux and the Xen hypervisor, are very competent. Other open source, Linux-native tools like Hadoop, Cassandra, HipHop, CouchDB and Btrfs also assist one in building a first-class data centre, very cost-effectively. These could be seen as the starting point for the coming wave of enterprise-scale open source adoption on the cloud.

Software platforms such as Eucalyptus, which enable the implementation of private and hybrid clouds, are also becoming very popular. Eucalyptus is a modular platform that is capable of working with a variety of interfaces, including Amazon’s EC2 and Simple Storage Service (S3) services. Eucalyptus works with various distros, including Red Hat Enterprise Linux (RHEL), CentOS, SUSE Linux Enterprise Server (SLES), OpenSUSE, Debian and Fedora. It can also host MS Windows images. It is capable of working with many virtualisation technologies such as VMware, Xen and KVM hypervisors, in order to implement the abstraction demanded by a cloud environment.
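Because Eucalyptus speaks the EC2 dialect, the same client tooling used against Amazon can be pointed at a private cloud. A minimal sketch with the euca2ools command-line clients, where the image ID and key name are placeholders:
euca-describe-availability-zones          # list the zones the private cloud exposes
euca-describe-images                      # list the machine images registered with the cloud
euca-run-instances emi-12345678 -k mykey  # launch an instance from an image
euca-describe-instances                   # check the state of running instances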
Solutions such as Cloudera build on the capabilities of popular open source options like Hadoop, to meet an enterprise’s cloud-computing needs. The Cloudera data management platform incorporates the Hadoop Distributed File System (HDFS), Hadoop MapReduce, Hive, Pig, HBase, Sqoop, Flume, Oozie, Zookeeper and Hue, and is available free under an Apache licence. The enterprise package includes support, tools and training.
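To give a flavour of how the Hadoop pieces are driven, a word-count run over a log file might look roughly like this; the path to the examples jar varies between distributions and is an assumption here, as is the existence of your HDFS home directory.
hadoop fs -mkdir input                                                  # create an input directory in HDFS
hadoop fs -put /var/log/syslog input/                                   # copy a local file into it
hadoop jar /usr/lib/hadoop/hadoop-examples.jar wordcount input output   # run the bundled MapReduce job
hadoop fs -cat output/part-r-00000                                      # read the results back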
Joyent’s SmartPlatform is an open source, server-side JavaScript-based framework for developing and delivering real-time, asynchronous Web applications to the cloud. It is basically a platform-as-a-service. While hosting is free at the moment, it might become a paid service once SmartPlatform graduates from beta to a stable release. People are betting big on Joyent’s offering, because of the ubiquity of JavaScript.
Enomaly’s open source cloud-management and provisioning software, the Elastic Computing Platform or ECP, allows an enterprise to create a private cloud inside its own data centres. It can also link the private cloud to a public one, as and when the company suddenly needs more computing resources. The tool-kit can also be used by service providers and telecom firms to quickly set up and deliver infrastructure-as-a-service (IaaS) cloud-computing services to customers. ECP includes security and compliance features, and Enomaly also offers service and support to licence holders, in a model very similar to that followed by companies like Red Hat. RightScale, Elastra and 3Tera are other similar offerings.
Nimbus is another notable tool-kit, maintained by the University of Chicago, which enables you to swiftly convert your cluster into an IaaS cloud. Currently, Nimbus is deployed into a Globus 4.0.x Java container, a system built around the Apache Axis engine. It supports three sets of remote interfaces: the Amazon EC2 Web services definition/description language (WSDL), the Amazon EC2 Query API, and the grid community Web services resource framework (WSRF). It also manages the security for these interfaces. The storage implementation is compatible with the Amazon S3 REST API, and virtualisation is based on Xen and KVM. Nimbus is known for being highly configurable and extensible.
George Paul brings to our notice another interesting cloud-computing project, Reservoir, funded by the European Union and coordinated by IBM. Last September, the group released its cloud stack featuring the Claudia Service Manager, a tool for automatic management of service scalability, and the OpenNebula Cloud Toolkit. OpenNebula is an open source cloud-computing tool-kit capable of managing several thousand virtual machines, along with large storage and networks. It supports all common cloud interfaces, and can fit into any existing data centre to help build a private, public or hybrid cloud.
Some of the other notable open source tools that are very useful in a cloud environment include Apache’s Traffic Server (a fast, scalable and extensible HTTP/1.1 compliant caching-proxy server), and Puppet (a configuration management tool written in Ruby and released under the General Public License or GPL).
Interestingly, Microsoft has also been counting open source strengths amongst the benefits of Azure, its cloud-computing platform. The Windows Azure software development kit for PHP and the Windows Azure Tools for Eclipse make it easy for programmers to deploy their PHP applications to the Azure cloud, not to forget the support for command-line developers to leverage scripting skills in the deployment of existing PHP applications. Then there is the Windows Azure Companion, which apparently eases the task of deploying open source community applications such as WordPress, SugarCRM, Drupal, etc., onto Windows Azure. Microsoft has also been working with many open source developers to put more and more FOSS programs on Azure.
The focus on open standards
It is clear that there are tons of open source software for cloud computing; you just need to make sure you pick the right ones. Remember, it is not just about open source software; you need to ensure that your choice is based on open standards too. That is precisely what the OpenStack project and consortium are all about.

OpenStack is a collection of open source technologies that deliver a highly-scalable cloud operating system, based completely on open standards. OpenStack has two interrelated projects: OpenStack Compute and OpenStack Object Storage. Compute is for provisioning and managing large groups of virtual private servers, while Object Storage is for creating redundant, scalable object storage using clusters of commodity servers. Object Storage can handle even petabytes of data! OpenStack integrates code from NASA’s Nebula platform as well as Rackspace’s Cloud Files platform, and is released under the Apache 2.0 licence. OpenStack’s forte is large-scale computational prowess, such as that needed for DNA modelling, space research, and the like.
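To make "object storage" concrete, this is roughly how an object is stored and retrieved over a Swift/Cloud Files-style REST API; the endpoint, account name and token below are placeholders rather than real credentials.
# store a local file as an object in the "backups" container
curl -X PUT -T backup.tar.gz -H "X-Auth-Token: $AUTH_TOKEN" https://storage.example.com/v1/AUTH_myaccount/backups/backup.tar.gz
# fetch it back
curl -o backup.tar.gz -H "X-Auth-Token: $AUTH_TOKEN" https://storage.example.com/v1/AUTH_myaccount/backups/backup.tar.gz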
More than anything else, OpenStack.org is a great example of how industry players are converging to set open standards in place to build clouds. OpenStack is backed by big names such as Rackspace, NASA, Dell, Citrix, Cisco and Canonical, not to forget the large global open source community. The team feels that an open development model is the only way to foster badly-needed cloud standards, remove the fear of proprietary lock-in for cloud customers, and create a large ecosystem that spans cloud providers.
“It is important for customers to have choices and not be tied down to a proprietary platform. Open standards for the cloud will play a key role in enhancing cloud adoption. When I say open standards, I don’t necessarily imply open source, but the fact that customers will have choices and the ability to stitch together solutions that meet their needs. Open standards and specifications would ensure that different products and tools can co-exist to deliver the cloud. Open source will be a key enabler to providing that choice. Such initiatives are already under way—one of them being the OpenStack consortium, in which Dell is a key participant. We believe such initiatives, and the existing open source ecosystem, will evolve cloud computing to the next stage,” says Sinha.
Open source, the best for India
There are loads of open source solutions for cloud computing out there. New ones, old ones, modified or extended ones, et al. Cloud computing being a nascent space, there is still lots more for the community to do. Sinha says, “Cloud computing throws up new problems to tackle, and new possibilities to address through the open source community. While some of the existing OSS contributions would be a great natural extension to the cloud world, there are several areas in the cloud-computing puzzle that require newer open source offerings.”

Of course, open source software and the community will continue to do its bit—and more—for cloud computing. “The ongoing impact of open source, the whole concept of SaaS meeting the cloud, is going to be a major trend in India. The open source approach in cloud computing will definitely grow in the future. Different organisations with different requirements can customise the applications according to their own use. New and advanced components, which are more flexible, transparent and cost-effective, can be integrated with the applications at any point of time, enhancing the capabilities of the system,” says Paul.
Advani sums up: “The availability and cost benefit of open source allows India to build cloud solutions for India. Service providers can adopt Ubuntu Enterprise Cloud, for instance, and provide a local, world-class scalable cloud provision for local businesses. The risk of a cloud in the proprietary world is reliance on a single vendor, external to the country, with service-level agreements that are liable to change. Widespread adoption of an open cloud solution is a much better route to avoid this risk. Of course, the open source cloud has all the advantages: it’s cost-effective, there’s no vendor lock-in, and you have access to the code, so you are not worried about getting stuck with the wrong technology!” That, after all, is very important for nascent technologies like the cloud.

Tuesday 31 May 2011

THOUGHTWORKS RELIES ON OPEN SOURCE

Open Source Software at ThoughtWorks

ThoughtWorks believes in Open Source Software and supports it in a number of ways. At ThoughtWorks our innovation network and the technical communities provide an environment for developers to spend dedicated company time on their open source initiatives. ThoughtWorks Studios is not only about commercial products but also about open source software, developing and supporting solutions such as CruiseControl and RubyWorks. In exceptional cases we have hired open source committers to allow them to focus on important open source projects as part of their day-to-day work. Our legal department gives assistance to projects by helping them with licensing and intellectual property issues. Finally, ThoughtWorks sponsors regular meetings by different groups in the open source community.
Where possible ThoughtWorks promotes the use of open source software on development projects. From our experience innovative approaches and pragmatic solutions for a particular technical problem are often created in the context of a larger business-driven development effort by practitioners in the field. When released as open source software these solutions get generalised and become part of the toolchest for the entire community. ThoughtWorks selects and benefits from a huge number of great open source projects and at the same time strives to give back to the open source community by releasing and supporting several open source projects. Driven by a passion for technology and software many ThoughtWorkers have also started new projects and/or contributed to existing open source projects.

Monday 30 May 2011

MiFi Devices will replace 3G USB Data Cards

MiFi Device
MiFi devices are portable WiFi routers that share an internet connection from 3G/GPRS/EDGE/CDMA/EV-DO and other mobile internet/broadband technologies.
Scenario
In addition to the broadband internet connection at home and the GPRS/EDGE/3G connection on the mobile phone, people now have a third requirement: mobile internet access from laptops and tablets while travelling. Tethering from the mobile phone (3G/EV-DO) is an obvious option, but it is impractical because it trades off a very valuable resource during travel, namely battery life (which is needed for making phone calls). Hence 3G USB data cards have become a new line of devices for internet on the go. Every telecom operator in India has a 3G data card in its product line.
USB data cards are not solving the problem right!
The third personal internet device today is the USB data card, and it is not the right third device, in many ways.
  • USB itself is a problem: This device is meant to provide internet access away from home and office on laptops, tablets, iPods and other internet devices. The 3G data cards sold by operators connect to the device via the USB port, but USB is a commodity interface only on laptops. Tablets, iPods, portable gaming consoles and the like have only WiFi as a commodity interface.
  • Does not support more than one device at a time: A USB data card provides out-of-the-box internet access only to the device it is plugged into. Today there is a need for internet access from multiple devices at the same time, and sometimes for multiple people (e.g. in-car internet).
  • Tied to a single operator: The devices that operators sell are mostly locked to that operator. Indians are used to unlocked mobile phones, and we always appreciate a device that supports all operators (like our mobile phones do).
How MiFi devices solve it right!
  • WiFi: This is the most common interface for internet access on laptops, tablets, iPods, portable gaming consoles and (sometimes) mobile phones.
  • Multiple devices at a time: MiFi devices have out-of-the-box support for connecting multiple devices simultaneously.
Products available in the market today (India)
Olive Telecom sells the cheapest one available in India, at INR 3500.


MicroMax sells one at INR 4900
Operators also sell directly:
  • Reliance: INR 8000
  • Vodafone: INR 5500

Saturday 28 May 2011

OPENSOURCE MISSION

Open source is a development method for software that harnesses the power of distributed peer review and transparency of process. The promise of open source is better quality, higher reliability, more flexibility, lower cost, and an end to predatory vendor lock-in.
The Open Source Initiative (OSI) is a non-profit corporation with global scope formed to educate about and advocate for the benefits of open source and to build bridges among different constituencies in the open source community.
One of our most important activities is as a standards body, maintaining the Open Source Definition for the good of the community. The Open Source Initiative Approved License trademark and program creates a nexus of trust around which developers, users, corporations and governments can organize open source cooperation.

Thursday 26 May 2011

How to install Drupal 7 in Ubuntu - steps


This guide describes all the steps needed to install Drupal 7 on Ubuntu Linux using the Apache web server and a MySQL database. It assumes a default installation of Apache and MySQL.
These instructions can also be used on Debian or any other Debian-based Linux system.
All commands below are executed with root privileges. If you are a sudo user, prefix each command with sudo.
Step 1: Prerequisites installation
# apt-get install php5-mysql apache2 \
mysql-server php5-gd

Step 2: Download and decompress Drupal install files

Download and extract all drupal files into /var/www/drupal directory:
# cd /var/www
Download and decompress Drupal 7 install files:
# wget http://ftp.drupal.org/files/projects/drupal-7.0.tar.gz
# tar xvf drupal-7.0.tar.gz
# mv drupal-7.0/ drupal
Change the owner of all Drupal 7 installation files to www-data (the Apache web server user):
# chown -R www-data.www-data /var/www/drupal/

Step 3: Configuring Apache for Drupal

Create an Apache config file for the Drupal website:
# cd /etc/apache2/sites-available
# sed 's/www/www\/drupal/g' default > drupal
Enable the Drupal site, disable the default site and restart the Apache web server:
# a2ensite drupal
# a2dissite default
# /etc/init.d/apache2 restart
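
Drupal 7's clean URLs also need Apache's rewrite module, with AllowOverride set to All for the /var/www/drupal directory so the bundled .htaccess file is honoured. If the module is not already enabled, the following extra step (a hedged addition, not part of the original guide) takes care of it:
# a2enmod rewrite
# /etc/init.d/apache2 restart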

Step 4: Create a MySQL database for the Drupal installation

In this step we will create the MySQL database to be used by our Drupal 7 website. By now you should have a MySQL server installed on your system, and you should have the root password to access the MySQL command-line interface. Let's create:
  • Database: drupal7
  • User: drupal7
  • Password: drupal7-pass
# mysql -p
Enter password: 
mysql> create database drupal7;
Query OK, 1 row affected (0.00 sec)
 
mysql> CREATE USER 'drupal7'@'localhost' IDENTIFIED BY 'drupal7-pass';
Query OK, 0 rows affected (0.00 sec)
 
mysql> grant all privileges on drupal7.* to drupal7@localhost;
Query OK, 0 rows affected (0.00 sec)
 
mysql> quit
Bye

Step 5: Drupal 7 install

Everything should now be ready for the actual Drupal 7 installation. From here on, the Drupal 7 web installer is rather self-explanatory. Navigate your browser to Apache's hostname or IP address and follow the installer to complete the Drupal installation.
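
If the installer complains that it cannot create settings.php or the files directory, preparing them by hand usually helps. This is a hedged extra step rather than part of the original guide, and it is normally unnecessary when the ownership change above succeeded:
# cd /var/www/drupal/sites/default
# cp default.settings.php settings.php
# mkdir files
# chown www-data:www-data settings.php files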