
WHAT IS HYIP

HYIP = High Yield Investment Program
A HYIP is a program (mostly online) that offers large returns in a relatively short time. Many programs promise more than 100% profit per month. With a minimum investment of $1, $5 or $10, HYIPs look like a promising investment.
New HYIP programs appear all the time, and thousands or even millions of people around the world try to make big profits from them. A single HYIP may be run by a company, by a group of people, or even by one person.
Most HYIPs are open to the general public, with no restrictions on country, age, education and so on. The only requirement is a willingness to invest, and the minimum varies, starting from as little as $1. Almost all HYIPs use e-gold to receive and send payments.
Most HYIP operators claim to run businesses in futures trading, forex, stocks, property, real estate and the like.

IndoChanger is not an investment program; it is an exchange for buying and selling E-Gold, E-Bullion and other e-currencies. If you do not yet have E-Gold or E-Bullion, you can buy them there. I recommend IndoChanger because I have been quite satisfied with their service, which has been running for more than four years, since May 2002. If you do not yet have an E-Gold account, create one first, and then buy E-Gold at IndoChanger. E-Gold and E-Bullion can be used for transactions or for investing in HYIPs (High Yield Investment Programs). They can also serve as a medium- to long-term investment in their own right: over the last two years the world gold price has jumped more than 50%, so holding a large amount of E-Gold can be very profitable, since the value of E-Gold fluctuates with the rise and fall of the world gold price. That is certainly more profitable than keeping money in a bank at single-digit interest.
Here are the main ways a HYIP may be run.
  1. Scam. The program merely collects deposits and pays investors a little as bait for larger investments. Once the deposits get big, the operators disappear... Thieves.
  2. The operators really do trade forex, but they are not experts, so it is just "gambling". HYIPs run this way go bankrupt quickly.
  3. The operators invest members' funds in other HYIPs.
  4. The operators really do have a good trading team (investment managers). This is the kind of HYIP investors want most.
    Unfortunately, online it is very hard to tell how a HYIP is really run. Even so, we can still take good profits in several ways.

In general, there are several approaches you can take in the HYIP business.

Spread your investment. We know that many HYIPs survive only a matter of months, so spreading the risk by joining several programs is important.

Suppose we split $100 across 10 programs, and 3 HYIPs turn out to be scams within 1 month, 3 more within 3 months, 3 more within 6 months, and 1 keeps running for a whole year; we still come out well ahead. Assuming each program pays 2.5% per day (paid daily), the results are -$7.50 (the scams), $37.50 (3 months), $105 (6 months) and $81.25 (1 year), for a total of $216.25 per year.

If you choose compounding instead, your total earnings from the 10 programs amount to "filthy rich". Try working the numbers with compounding (profit added to the principal every day).
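These figures can be checked with a short script (a sketch in Python; the $10 per-program stake and the assumption that the principal is never recovered follow the scenario above):

```python
# Net result of one HYIP paying 2.5% per day (simple interest, paid
# daily), assuming a $10 stake that is lost when the program dies.
def simple_net(days, stake=10.0, rate=0.025):
    return stake * rate * days - stake

# 3 programs die at 1 month, 3 at 3 months, 3 at 6 months, 1 lasts a year.
lifetimes = [30] * 3 + [90] * 3 + [180] * 3 + [365]
total = sum(simple_net(d) for d in lifetimes)
print(round(total, 2))  # 216.25 per year on the $100 spread

# The same payout with daily compounding (profit added to principal):
def compound_net(days, stake=10.0, rate=0.025):
    return stake * (1 + rate) ** days - stake

# A single program that survived a full year would grow explosively.
print(round(compound_net(365), 2))
```

Of course, the whole calculation stands or falls with the assumption that the programs actually pay out for as long as they live.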

First come, first served. If the average program lives about 3 months, no matter how it is run, then those who join early make a profit! Once you are in profit, invest only the profits you have earned.

Study before you invest. What should you know before investing?

  1. Does the program's website look professional? It should not use free hosting or a free domain.
  2. Is there an address or some contact other than email? If so, that is a plus.
  3. What do other investors (who have already joined) say? This can be found in many forums.
  4. Be rational! Does the promised profit seem plausible to you? Big profits are possible in forex trading, but very few traders can profit consistently, month after month.
  5. How does withdrawal work? Automatic daily payment to your e-gold account, or withdrawal on request? Full or partial compounding? What is the withdrawal fee, and when can withdrawals be made? Make sure you understand how your HYIP works.

Invest only as much as you dare to lose. What does that mean? Once you enter a HYIP, you no longer have any power over that money. You cannot complain about a program to the authorities such as the police; the FBI and the like cannot be counted on much either.

Know your type as an investor. Knowing what type of investor you are will make you better prepared for the risks of your investment decisions.

1. The smart type. This is you if you enter the HYIP business with calculation and a long-term strategy. Losing in HYIPs is normal, and this type turns those losses into lessons so as to keep taking profit out of the HYIP arena. Investors of this type are the most fortunate.

2. The panic type. People of this type are usually in financial difficulty: perhaps just laid off, bankrupt, buried in debt and so on. They dream of becoming millionaires overnight, so they dare to make reckless investment decisions and put in large amounts of money. If the decision is wrong, it is all over!

3. The gambling type. For this type, HYIPs are just entertainment. They know the risks and make decisions on a hunch. They usually have enough money that they do not feel much loss when they lose. These are the people who get addicted to HYIPs.


Suni

Database Performance: The Web Layer


A database application is like a convoy of ships: it is only as
fast as the slowest ship. The three "ships" in a web-based database
application are the database itself, the web layer, and the browser.
Today we continue our series on performance by examining how
the web layer can efficiently retrieve data from the database.




Welcome to the Database Programmer blog. This blog is for anybody who wants to see practical examples of how databases work and how to create lean and efficient database applications. There is a Complete Table Of Contents (http://database-programmer.blogspot.com/2007/12/database-skills-complete-contents.html) that is updated each week, and a Master list of table design patterns (http://database-programmer.blogspot.com/2008/01/table-design-patterns.html) that is updated whenever a new design pattern is presented.



Cost 1: Round Trips



The first basic cost of retrieving data is the "round trip".
Database programmers speak of a "round trip" as occurring whenever you
send a request to the server and retrieve some results. Each round trip
to the server carries some overhead, as the server must do some basic
work to allocate and release resources at the start and end of the
request. This overhead is added to the base cost the server must pay
to actually go out to disk to find and retrieve your data.



If your application makes more round trips than are necessary, then
the program will be slower than it could be.
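To make the cost concrete, here is a toy sketch of my own (Python with SQLite; the tables are invented, and with a real client/server database every `execute()` would be a network round trip): fetching an order's item descriptions one query per line costs 51 trips, while a single JOIN costs one.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE order_lines (order_id INT, sku TEXT, qty INT);
    CREATE TABLE items (sku TEXT, description TEXT);
""")
con.executemany("INSERT INTO items VALUES (?, ?)",
                [(f"S{i}", f"Item {i}") for i in range(50)])
con.executemany("INSERT INTO order_lines VALUES (1, ?, 1)",
                [(f"S{i}",) for i in range(50)])

# Anti-pattern: 1 query for the lines, then 1 more query per line.
round_trips = 1
lines = con.execute(
    "SELECT sku, qty FROM order_lines WHERE order_id = 1").fetchall()
for sku, qty in lines:
    con.execute("SELECT description FROM items WHERE sku = ?", (sku,))
    round_trips += 1
print(round_trips)   # 51

# Better: one round trip that retrieves everything needed.
rows = con.execute("""
    SELECT ol.sku, ol.qty, i.description
      FROM order_lines ol JOIN items i ON i.sku = ol.sku
     WHERE ol.order_id = 1""").fetchall()
print(len(rows))     # 50
```

The JOIN version returns exactly the same information while paying the per-request overhead once instead of 51 times.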




Cost 2: Retrieval Size



Every byte that the application retrieves from the server carries a cost
at several points. The server must go to disk and read it, the wire
must carry the load from the db server to the web server, and the web server
must hold the result in memory. If your web code regularly retrieves
more information than it needs, then the program will be slower
than it could be.



This is why you will see the advice never to
use "SELECT *..." in direct queries: it is near certain
that you are retrieving data you will not use. The better
query names the exact columns you need, so that you get maximum
efficiency. This is especially important if your table
contains text (aka CLOB) columns; if you use "SELECT *..." on one of
those tables, you risk pulling all kinds of
data over the wire that is just going to be thrown away.
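A quick way to see the difference is to compare the payload of "SELECT *" against a query that names only the columns it needs (a sketch of mine in Python with SQLite; the table and the 10 KB text column are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE notes (id INT, title TEXT, body TEXT)")
# 100 rows, each dragging along a 10 KB text (clob-like) column.
con.executemany("INSERT INTO notes VALUES (?, ?, ?)",
                [(i, f"t{i}", "x" * 10_000) for i in range(100)])

star = con.execute("SELECT * FROM notes").fetchall()
slim = con.execute("SELECT id, title FROM notes").fetchall()

# Crude payload estimate: total characters in each result set.
bytes_star = sum(len(str(row)) for row in star)
bytes_slim = sum(len(str(row)) for row in slim)
print(bytes_star > 100 * bytes_slim)   # True: the unused body dominates
```

Both queries return 100 rows, but the "SELECT *" version moves hundreds of times more data for no benefit.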



Example 1: One Round Trip By Using JOIN



Consider the basic case where you are retrieving and displaying the
line items from an order (or shopping cart as people call it these
days). Let us assume that you have an ORDER_LINES table that contains
SKU, QTY, and PRICE among others. The item's description is in the
ITEMS table. To display the lines, you must retrieve each line and
also retrieve the item's description.



To do this most efficiently, we can make a single round trip to the
server that retrieves all of the columns we need in one shot, then
do a loop to render them like so (the example is in PHP):




# Assume some function that gives you the order number,
# sanitized for safe substitution into the SQL
$order = GetOrderNumber();

# Form the SQL: name the exact columns needed, and JOIN to
# ITEMS so the description arrives in the same round trip
$sq = "SELECT ol.sku, ol.price, ol.qty, ol.extended_price
            , i.description
         FROM ORDER_LINES ol
         JOIN ITEMS i ON ol.sku = i.sku
        WHERE ol.order = $order";

# Most frameworks should have some command to retrieve
# all rows for a query, something like this:
$lines = SQL_AllRows($sq);

# Finally, render the HTML
foreach ($lines as $line) {
    #
    # HTML rendering code here
    #
}


I should stress that this example carries a reasonable expectation
that the order is small enough that you do not start hitting the inefficiencies of your particular language. Rendering
large result sets in a web application is severely problematic compared
to the old desktop systems, and doing so requires separate techniques
that will have to wait for a future essay.




Example 2: Caching Small Tables



Sometimes you will need to generate a printed report that involves
many tables, including several description lookups. For instance,
I have a medical
application that generates statements for all patients who have
a balance. A typical run will produce 100 or 200 statements, and
each statement requires information from no less than 8 tables.



In cases like this you can simplify your queries by retrieving the
small lookup tables in their entirety before going to the
main query and loop. For the example of the medical program there are
two tables that qualify for this treatment. These are the tables of
"ICD9" codes and "CPT" codes. Both of these usually have only about
100 rows, and there are only 2 relevant columns in one and 3 in the
other. Therefore there is a big gain to be had by simply loading them
into RAM ahead of time and simplifying the resulting code.




This bare-bones example shows simply that the tables are loaded
first, and then main execution begins.




# The function SQL_AllRows() gives me the complete result from
# a query; the 2nd argument says to return an associative
# array with key values made out of the named column.
# NOTE: an "icd9" code is a medical diagnosis code
$icd9codes = SQL_AllRows(
    "SELECT icd9code, description FROM icd9codes"
    ,"icd9code"
);

# NOTE: a "CPT" code is a medical procedure code
$cptcodes = SQL_AllRows(
    "SELECT cptcode, description FROM cptcodes"
    ,"cptcode"
);

# ...now continue by pre-fetching the list of patients
# we will be dealing with, and then we can finally
# go into the main loop and refer to the $icd9codes
# and $cptcodes arrays as needed.
#
$patients = SQL_AllRows(
    "SELECT patient FROM patients WHERE balance > 0
      ORDER BY last_name, first_name"
);
foreach ($patients as $patient) {
    #
    # retrieve the statement information, using the
    # arrays $cptcodes and $icd9codes to display
    # descriptions for those codes
    #
}


Knowing Your Context



There is one more piece of the puzzle that a programmer must have in
order to make wise decisions when trying to balance round trips and retrieval
size: a thorough knowledge of your context. Knowing your
context can dramatically help in making decisions.



Some examples of context are:

  • Huge social networking site or portal with hundreds of hits per
    second.
  • eCommerce site.
  • Line of business program used by the staff of a company to do their
    daily work.


My own context is the third item, line of business applications. In this
context the following realities hold:



  • A huge user base might be a few hundred, with never more than
    five or six simultaneous transactions going on.
  • A much more common user base is 10-20 users (or even 3 or 4!),
    with one transaction every 5-20 seconds.
  • The public website accessed by customers is limited to a few thousand
    potential users, of which you rarely if ever have two or more users on
    at the same time.


In this context I have a wealth of server resources, because my customer
can spend as little as $1500.00 and get a server with more RAM than 10-20
users will ever use at the same time. Therefore, my own coding habits
often tend toward caching lookup tables and pulling 300 rows into memory
in one shot so that I can get them to the screen (or PDF, or CSV...) as
fast as possible. But these decisions are guided by the context of
my applications; if your context is different, you may be led to
different conclusions.



Conclusion



It is not difficult to create database applications that perform
well. The basic rules of thumb are to make a minimum number of round
trips to the server and to retrieve precisely the values that you
need and no more. These ideas work well because they minimize
your most expensive operation, which is disk access.



It is also perfectly acceptable to denormalize
your tables (following Denormalization Patterns,
http://database-programmer.blogspot.com/2008/04/denormalization-patterns.html),
which simplifies your queries and
reduces JOIN operations.
Finally, you must know your context well, so that
you can evaluate techniques such as caching lookup tables.






These ideas form the cornerstone of most performance optimization
and you will find that applying them over and over rigorously will
give you most of what you need to keep performance strong in the
web layer.



"http://database-programmer.blogspot.com/2008/07/database-performance-pay-me-now-or-pay.html"
>Next Post: Pay Me Now or Pay Me Later


Gameloft Prince Of Persia Classic v1.0 for Symbian S60, S60 v3 and Java

Rediscover the original Prince of Persia in a totally revamped mobile version!

* The completely revamped version of the hit game released in 1989, Prince of Persia.
* Many spectacular moves for ultra-dynamic fights and breathtaking acrobatics.
* Immerse yourself in the world of One Thousand and One Nights with rich backgrounds and dynamic lighting effects.
* 4 game modes, featuring 3 brand new modes: Normal, Classic, Time Attack and Survival.
* New options for beginner players: mini-map, numerous checkpoints, etc.




Download Here

Alternative Mirror

http://rapidshare.com/files/74391045/Gameloft.Prince.Of.Persia.Classic.176x208.v1.0.1.S60v3.J2ME.Retail-BiNPDA.rar
http://rapidshare.com/files/74391653/Gameloft.Prince.Of.Persia.Classic.352x416.v1.0.0.S60v3.J2ME.Retail-BiNPDA.rar
http://rapidshare.com/files/74392822/Gameloft.Prince.Of.Persia.Classic.v1.0.3.S60v3.N95.J2ME.Retail-BiNPDA.rar



Download

Pass : www.dl4all.com

Download 2


40 Applications for S60 v3.39


For Symbian phones like:
Nokia N80, Nokia N91, Nokia N92, Nokia 3250, Nokia N71, Nokia N73, Nokia N93,
Nokia N93 Golf, Nokia E60, Nokia E61, Nokia E70, Nokia E50

Adobe PDF 1.1.5 (read PDF)
IM+ 5.50.0 (messenger for Hotmail, Yahoo, ICQ, Gmail, etc)
AgileMessenger 3.76.0 (messenger for Hotmail, Yahoo, ICQ, Gmail, etc)
Wireless IRC 2.0.686 (irc client)
Alarm Manager 1.4.1 (set multiple alarms)
Best BlackList 1.0.0 (black list unwanted calls)
FExplorer 1.16.0 Beta (symbian file explorer)
FGet 0.70.0 (download manager with resume support)
Flash Player 2.0.1 (play Flash files)
Handy Expense 2.2.0 (keep track of daily expenses)
Internet Time 1.4.0 (synchronize clock with internet time)
Mobipocket Reader Pro 5.1.532 (read eBooks)
MSDict - Oxford Concise English Dictionary 2.40.0 (dictionary)
My Assistant 1.2.726 (dictaphone & auto keylock)
NiceCalc 1.0.1 (scientific calculator)
Quickoffice 2.3.6.0 (read/write Word, Excel & Powerpoint)
OfficeSuite 2.10.0 (read/write Word and Excel)
Opera 8.60.0 (internet browser)
PanoMan 1.19.0 (take panoramic images)
Papyrus 1.108.0 (advanced calendar & to-do)
PhotoRite SP 5.30.0 Beta (advanced camera with frames, mirror effect, etc)
PowerMP3 1.1.0 (music player with equalizer)
PhonePoint 2.0.0 (powerpoint remote control over bluetooth)
ProfiMail 2.40.0 (advanced email client)
PuTTY 1.4 Beta 1 (SSH client)
Resco News 1.13.0 (RSS reader)
SmartMovie 3.21.0 (DIVX & XVID player & Converter)
Best ScreenSnap 1.1.0 (screenshot)
Torch 1.10.0 (torch light)
VirtualRadio 1.0.4 (online radio streaming)
Mobiola WebCam 1.4.0 (enable n70 to be a web camera for windows)
WinRAR 2.50.10 (compress/uncompress zip and rar)
WmaOGG Plugins 1.1.0 (plugin to enable wma and ogg support)
WorldMate 2.60.40 (world clock, weather forecast, currency rate, etc)
Zi Predictive Text Suite 1.4.0 (auto complete text while typing)
123sMMCfonts 1.19.0 (enable reading of Chinese fonts)
ChessGenius 1.40.0 (english chess)
Experimental Chinese Chess 1.1.1 (Chinese chess)
Anti-Mosquito 2.1.0 (anti-mosquito program)

Link Change

Download | 27939 KB

Download Here





VirtualBox Free Virtual machine


When I was new to computers I used to imagine installing and running more than one operating system simultaneously without rebooting, but at the time it was not possible for me. Two years back someone introduced me to virtualization technology and the VMware virtual machine. I used VMware from then on, but I was not happy with it, as it was not free and I do not like to use pirated products; still, I used VMware at home until I got VirtualBox.



VirtualBox is a free and open-source virtual machine that works well on Windows, Linux, Mac and Solaris hosts and runs a large number of guest operating systems, including all versions of Windows up to Vista, plus Linux and Solaris.

It is not as robust as VMware Workstation, but I must say it is great virtualization software that can run multiple operating systems on a single PC. It gives all the features that VMware gives.



Here is the feature set of VirtualBox:

* Remote Desktop Protocol: control of the VM over RDP

* iSCSI support

* USB support with remote devices over RDP

* Snapshots

* Seamless mode

* Clipboard

* Shared folders

* Special drivers and utilities to facilitate switching between systems

* Command line interaction (in addition to the GUI)

* Remote display (useful for headless host machines)



As it is open source, its source code is available to download and modify under the General Public License.



I just love this tool: it is only 22MB in size, yet it is not lacking anywhere in features or performance.



Well, I could say a lot more about VirtualBox, but that would fill a lot of pages. Do visit http://www.virtualbox.org/ for more details.



The latest version of VirtualBox is 1.6, which you can download from here for your operating system, or visit www.virtualbox.org.



I wish they could also include MacOS as a guest operating system in the near future, though that is very difficult.


Main board, Processor, RAM information gathering tool

If we have an old computer whose main board is unknown to us, we cannot identify it even by opening the PC case, and, worse, we do not have its driver, then reformatting that machine is frightening: we neither know the name of the main board nor have a driver for it. I had exactly this problem, so I looked for a tool that could give me details of the main board, and I found something that does even more.
I found CPU-Z. It is a small free tool that reports not only the main board details but also the processor speed and cores, RAM frequency, cache information, the size and chipset brand of the RAM module in each slot, and so on.

This is a free tool from http://www.cpuid.com.

You can download this tool from its website or directly from here

Download MotoTools 2007 1.32


Using state of the art unlocking technology, your Motorola can be unlocked in a matter of minutes, easily, and cheaper than any other Motorola unlocking service available.

Our unlocking service really is the lowest priced on the internet. We offer you choice! Phone locking is when a specific network carrier programs your mobile/cell phone so that it only accepts their network's SIM cards, i.e. Vodafone phones only accept Vodafone SIM cards. In short, it forces you to pay them, without the choice of changing to a cheaper, better network where you would be able to obtain cheaper calls, texts and other great deals and offers. Phone unlocking is the removal of this lock, enabling the phone to be used with ANY network.

Links:

http://rapidshare.com/files/87638023/mototools.2007.1.32.full.incl.crack-rev.zip

http://rapidshare.com/files/87638024/mototools.2007.1.32.full.incl.crack-rev.zip


Pass:

passiondownload.com


Free PDF converter


doPDF is a free PDF converter for both personal and commercial use. Using doPDF we can create searchable PDF files by selecting the "Print" command from virtually any application. With one click you can convert your Microsoft Excel, Word or PowerPoint documents or your emails and favourite web sites to PDF files.



doPDF installs itself as a virtual printer driver, so after a successful installation it will appear in your Printers and Faxes list. To create PDF files, you just print your documents to the doPDF converter. Open a document (with Microsoft Word, WordPad, NotePad or any other software), choose Print and select doPDF. It will ask where to save the PDF file, and when finished, the PDF will automatically open in your default PDF viewer.



Main features:

- No Ghostscript required

- Customizable resolution

- Predefined/custom page sizes

- Searchable PDFs

- Multilanguage support

You can download the doPDF from here or



do visit www.dopdf.com




Firefox Extension : Better Gmail 2

I always wondered whether I could use Gmail's Cc and Bcc fields without clicking a link, or change the look of Gmail to something new and cool, but it was not easy for me earlier.
A friend of mine introduced me to the Better Gmail extension. I tried it and found it a very good extension, not only for changing the look but for many other Gmail features that I needed. Installing it is just one click in Firefox from the Firefox add-ons site:
https://addons.mozilla.org/en-US/firefox/addon/6076
You can customize the add-on and all of its features from Tools > Better Gmail.

Seattle/Redmond/Bellevue Nerd Dinner - July 22nd, 2008


 

Event details

Seattle/Redmond/Bellevue Nerd Dinner - July 22nd, 2008

Date and time:
Tuesday, July 22, 2008
6:30 PM - 9:00 PM

Hosted by:
ScottHa

Are you in King County? Are you a huge nerd? Perhaps a geek? No? Maybe a spaz, dork, dweeb or wonk. Maybe you're in town for an SDR (Software Design Review) or the ASPInsiders meeting. Quite possibly you're just a normal person.

Regardless, why not join us for some Mall Food at the Crossroads Bellevue Mall Food Court on July 22nd around 6:30pm?

If you want to come and share something with the group, please do! We're language and technology agnostic and always eager to learn about new stuff.


Location:
Crossroads Bellevue Food Court (http://www.crossroadsbellevue.com/restaurants/index.htm)
15600 NE 8th St, Bellevue, WA 98008-3927, United States





Useful Websites: Old Version Software


Sometimes upgrading to a newer version can be a good thing. Other times, your computer may not be compatible with the new version, the new version is bloated, or the options you liked are no longer available. Then you look for the old version, but it has been removed from your hard disk as well as from the vendor's website. For old versions of software, you can go to one central place that keeps a lot of them. I have used this for a long time, ever since I needed an old Winamp that only had to play MP3 songs. My friend Abhishek introduced me to this service, and since then I have used it for many programs.

For all kind of old version software you can visit any of these websites
www.oldversion.com
www.oldapp.com
www.old-version.net

All the software you download from www.oldversion.com is free of spyware and viruses. I have only used www.oldversion.com so far, and I found all the software free of viruses and spyware.

This service can be a great help for those who are unable to upgrade their computers, those who find their computer cannot run the latest software, and those who are not happy with the performance or new features.

I believe these websites are doing a good job by preserving software history for ordinary users, because once a software version is removed from the internet, it is removed from history.

SQL server 2008 Release candidate


SQL Server 2008 "Release Candidate 0" has been released by Microsoft, and the final version of SQL Server 2008 will be released very soon.

If you want to test it, you should install it now, so that when the final version is released you will be ready for SQL Server 2008. You can download SQL Server 2008 from the following location:
http://sqlserver.dlservice.microsoft.com/dl/download/7/A/2/7A2F6647-7110-479F-BAA2-CCFD5DA6F436/SQLFULL_ENU_x86.exe

For more information and technical detail please visit http://www.microsoft.com/sqlserver/2008/en/us/default.aspx.

The following software is prerequisite for installing SQL Server 2008. I have downloaded the packages and copied them onto the local network; here are the locations:

MDAC (Microsoft Data Access Components)
Windows Installer 4.5 required.
.NET Framework 3.5 required.
SQL Server 2008
Supported Operating Systems: Windows Server 2003 Service Pack 2; Windows Server 2008; Windows Vista; Windows XP Service Pack 2
Hardware: for the 32-bit version you need a Pentium 3 with 1 GB RAM; recommended is a Pentium 4 with 2 GB RAM


You can download all of the required components from the locations below.

http://sqlserver.dlservice.microsoft.com/dl/download/7/A/2/7A2F6647-7110-479F-BAA2-CCFD5DA6F436/SQLFULL_ENU_x86.exe (SQL Server 2008)
http://www.microsoft.com/downloads/details.aspx?FamilyID=5a58b56f-60b6-4412-95b9-54d056d6f9f4&displaylang=en (Windows Installer)
http://www.microsoft.com/downloads/details.aspx?FamilyID=6c050fe3-c795-4b7d-b037-185d0506396c&displaylang=en&Hash=kwylJ4PWN4zbRbsq2GwGzkfTOBGOzhGXQwjfEWMwXPUak70xel1u%2bG0fRoS9ITkW%2fXVI5pM647ysHjoaKOAjug%3d%3d (MDAC 2.8)
http://download.microsoft.com/download/c/d/8/cd8aad12-bb3b-4f70-a3a1-e00b516011b0/dotnetfx35.exe (.NET Framework 3.5 full installation; it will not ask to download components from the internet)

Before you use Microsoft Visual Studio with SQL Server 2008 RC0, install the following updates:

For Visual Studio 2005, install Visual Studio 2005 Support for SQL Server 2008 RC0.
For Visual Studio 2008, install Visual Studio 2008 Service Pack 1 Beta.
For Visual Studio 2008 with SQL Server 2008 RC0 Express, download and install Microsoft Visual Studio 2008 Express Edition with SP1 Beta.
You can install support for both Visual Studio 2005 and Visual Studio 2008.
I hope this information was useful and beneficial for you.

Dot.net Framework 3.5 offline installation


Someone in my office asked me to install .NET Framework 3.5 and I said OK, I would do that. I downloaded the ".NET Framework 3.5" installer from Microsoft and started the installation. It was a 6MB file, so I thought it would give no problem, but when the installation started it said it needed to download the rest of the setup, some 60MB of data. If it were a one-off I could have allowed it, but it was not a one-time setup, so I looked for an offline solution and found one easily.
This offline setup contains both the 32-bit and the 64-bit version of .NET Framework 3.5.

You simply need to download this setup from here.

http://download.microsoft.com/download/6/0/f/60fc5854-3cb8-4892-b6db-bd4f42510f28/dotnetfx35.exe

Alternatively, you can download it from RapidShare, where I have uploaded it:
http://rapidshare.com/files/302663024/dotnetfx35.part1.rar
http://rapidshare.com/files/302665809/dotnetfx35.part2.rar
http://rapidshare.com/files/302670178/dotnetfx35.part3.rar

After the download, just run the setup: it will extract itself and start the installation. After a few steps it will say it needs to download around 60MB; don't panic. It is not going to download anything, so simply click Next. It will pick the required files from the extracted setup, report within a few seconds that the download is complete, and you can disconnect from the internet.

Things to remember
1) You will need to close MS Office applications, so save your work and then run it.
2) The complete .NET Framework package is around 200 MB, so make sure you really want to download a 200MB file.
3) Backing up your data is always a good idea.

Database Performance 1: Huge Inserts


The modern database server provides a wealth of features
for robust and reliable storage. Understanding
how these features work is vital if you want fast performance
for your databases. This week we begin a series on performance
by looking at "ACID" compliance and how it affects our handling
of large operations.







What is ACID Compliance



The modern database provides a set of features known as
ACID compliance which
make the database very robust. To paraphrase the Wikipedia article,
ACID compliance means that:



  • Each transaction is Atomic. It is completed in its entirety
    or not at all.
  • The database is always Consistent. No user ever sees the
    intermediate and possibly invalid state of the database while your
    transaction is in progress.
  • Each transaction is Isolated. Your changes do not get mixed
    up with other people's changes, even though they are executing at the
    same time (see more in the Wikipedia article on Serializability:
    http://en.wikipedia.org/wiki/Serializability).
  • Each transaction is Durable. Once the database says the job
    is complete without errors, you are assured the database has checked
    all constraints and keys and the transaction is completely valid. In most
    cases we also take this to mean the data is safely on disk and pulling
    the plug will not corrupt it.
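
The Atomic property in particular can be demonstrated in a few lines (a sketch of my own using Python's built-in SQLite; the accounts table is invented):

```python
import sqlite3

# Atomicity: if any statement in a transaction fails, none of the
# transaction's changes survive.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (id INT PRIMARY KEY, balance INT)")
con.execute("INSERT INTO accounts VALUES (1, 100)")
con.commit()

try:
    con.execute("UPDATE accounts SET balance = balance - 50 WHERE id = 1")
    # This violates the primary key, so the transaction fails...
    con.execute("INSERT INTO accounts VALUES (1, 0)")
    con.commit()
except sqlite3.IntegrityError:
    con.rollback()   # ...and the UPDATE above is undone along with it

balance = con.execute(
    "SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
print(balance)   # 100: all or nothing
```

No other session ever sees the intermediate balance of 50, which is the Consistency and Isolation guarantees at work.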


Maintaining ACID compliance is expensive for the database server. It must
in effect keep two versions of every row in play, and it must do so while
multiple users have multiple transactions running at the same time, even
while other users may be trying to read the rows that are being
affected.
This cost is considered more than acceptable when the reliability
requirement is high. But there is one case where the inevitable
consequence of ACID compliance is to destroy performance, and this is
on large UPDATEs and INSERTs. Today we are going to look particularly
at large INSERT operations.



Consider the case where you are creating a new system that must load
1 million rows into a single table from an older system.
You go to your server's manual and
find the command to do so (For PostgreSQL it is COPY..., for SQL Server
it is BULK INSERT...). You work out the painful details of the command
with a test file of 10 rows and get the syntax down. Then you issue the
command with the real file of 1 million rows. After a minute or two you
realize, well, 1 million rows is a lot, time to get some coffee. Returning
with the coffee you see that it is still not finished, so time to check
some email. An hour later it is still not done, and when you leave it
running overnight and come back in the morning your machine is frozen.



The problem here is simply one of numbers. One million rows with an average
length of 100 characters (in ASCII or UTF-8) comes to about 100 megabytes.
The server must do no less than maintain two completely separate states
for the database -- one state without your rows and one with your rows.
The cost of this is several times the actual size of the input data,
so the 1 million rows in this example will take several hundred
megabytes
of resources, at least!
The server manages this process both on disk and in RAM. This will
simply die on a laptop or development
workstation, even one with a gig or two of RAM.
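The arithmetic behind those numbers is easy to check. The 3x-5x multiplier below is my own rough assumption for the two row states plus log bookkeeping, not a measured figure:

```python
rows = 1_000_000
avg_row_bytes = 100                  # average row length in ASCII/UTF-8
raw_mb = rows * avg_row_bytes / 1_000_000
print(raw_mb)                        # 100.0 megabytes of raw input

# The server holds at least two states (with and without the new rows),
# plus rollback/log bookkeeping; the 3x-5x multiplier is an assumption.
low, high = 3 * raw_mb, 5 * raw_mb
print(low, high)                     # 300.0 500.0 -- several hundred MB
```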



You can maybe go out and buy some RAM, but the purpose of this essay is
to explain how to deal with those inevitable cases where the operation
you are performing requires more resources than you have. This situation
will always come up, so it is good to know how to deal with it.



Step 1: Drop Indexes and Keys



ACID compliance extends to indexes as well. When you INSERT many thousands
or millions of rows to a single table in one shot, the server must maintain
two separate versions of each index. This burden is laid on top of
the burden of calculating the index keys for every single row one-by-one.
If we began with a burden of several hundred megabytes of resources,
just a few indexes on your table could end up more than doubling that.



This is why you will see advice on mailing lists and forums to drop
indexes before doing large insert operations.



Your table will have one index automatically for the primary key, so you
must drop the primary key. You will also want to drop foreign keys so
that you do not waste time checking them (they also have indexes).
Unique constraints also end up creating
indexes automatically behind the scenes, so you must drop those.
Other constraints must also be dropped to prevent having them checked
for every single one of your million rows.
Finally, you must drop all
indexes you created yourself. After the load is complete, you must
recreate these keys, indexes, and constraints.
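The drop-load-recreate pattern looks roughly like this. I use Python's sqlite3 module as a small stand-in (on a real PostgreSQL or SQL Server database you would also drop constraints with ALTER TABLE); the table and index names are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE load_target (id INTEGER, name TEXT)")
conn.execute("CREATE INDEX idx_target_name ON load_target (name)")

rows = [(i, "name%06d" % i) for i in range(50_000)]

# Step 1: drop the index so it is not maintained row-by-row during the load
conn.execute("DROP INDEX idx_target_name")

# Bulk load with no index overhead
conn.executemany("INSERT INTO load_target VALUES (?, ?)", rows)

# Recreate the index in a single pass after the load, then commit
conn.execute("CREATE INDEX idx_target_name ON load_target (name)")
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM load_target").fetchone()[0]
print(count)  # 50000
```

Building the index once over the finished table is far cheaper than updating it fifty thousand times, once per inserted row.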



In some cases, where your load is small enough, this may be enough to
get predictable load times, and you can stop here. But the larger the
operation, the more likely that this will not be enough. In those cases,
there is another step to take.



Step 2: Chunks



The basic problem described above is that the database performance has
gone non-linear. When you double the number of rows, it does not
take twice as long, but four times (or 3 or 10 or whatever). When you
multiply the rows by 10, it may not take 10 times as long, you might see
it take 100 times as long, or more! (Or maybe you just killed it after
you came back in the morning and your workstation was frozen).



We can restore linear performance if we break the input into chunks
and load them one at a time in succession. You break the input into files
that are small enough so that no individual file will send the
server into non-linear hell. If we find that a
chunk of 5000 rows loads in 4 seconds, and we have 2000 of these files,
we now have a predictable load time. We have restored linear
performance
because we know that twice as many rows will take twice
as long.
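Plugging in those numbers shows what "predictable" buys you: the total is long, but it is now a simple multiplication you can plan around rather than an overnight mystery:

```python
chunk_rows = 5_000             # rows per chunk file
seconds_per_chunk = 4          # measured by timing one sample chunk
chunks = 2_000                 # number of chunk files

total_rows = chunk_rows * chunks
total_seconds = seconds_per_chunk * chunks
print(total_rows)                        # 10000000 rows in total
print(total_seconds)                     # 8000 seconds...
print(round(total_seconds / 3600, 1))    # ...about 2.2 hours, but predictable
```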



I am currently working on a system where I must periodically
load 3 tables of about 3 million rows each
from an old desktop Visual FoxPro system into a Postgres
database. The chunking code on the output side looks something like this:




* This is FoxPro code, which is vaguely like BASIC...
mCount = 0
mIncrement = 5000               && Hardcoded chunk size
mRN = Reccount(p_tableName)     && Fox's function to get the row count

* These three commands move the record pointer to the top
* of the table and turn off all indexes
SELECT (p_tableName)
LOCATE
SET ORDER TO 0

FOR rnStart = 1 TO mRN STEP mIncrement
    mCount = mCount + 1

    * Each loop outputs the next 5000 rows and leaves the
    * record pointer ready for the next loop.
    * FoxPro uses a semi-colon to continue onto the next line.
    COPY TO (m_dir + p_tableName + "_" + PADL(mCount, 6, '0') + ".asc") ;
        FIELDS column1, column2, column3 ;
        WHILE RECNO() < (rnStart + mIncrement) DELIMITED

    * This is FoxPro's 'echo' command, a question mark
    ? p_tableName + " copied " + STR(_TALLY) + " records"
ENDFOR


Then on the receiving end I need a program that reads the chunks and
loads them in. The relevant portion of the code is here (the example
is in PHP and loads to PostgreSQL):




# Assume variable $fcnt holds the number of chunks to load
# and that variables like $tabname, $ddir etc. hold file names,
# directory locations, column lists and so forth.
for ($c = 1; $c <= $fcnt; $c++) {
    $insert = "_" . str_pad($c, 6, '0', STR_PAD_LEFT);
    LogEntry(" loading file # "
        . str_pad($c, 6, ' ', STR_PAD_LEFT)
        . ' of ' . str_pad($fcnt, 6, ' ', STR_PAD_LEFT)
    );
    $cmd = "COPY $tabname ($collist) "
         . " FROM '$ddir$afile$insert.asc' DELIMITERS ',' CSV QUOTE '\"'";
    # This is my framework's super-simple direct SQL command
    SQL($cmd);
}
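For readers without FoxPro, the sending side of the same idea can be sketched in ordinary Python: split a stream of rows into numbered chunk files that a loop like the PHP one above can pick up. The file naming scheme and chunk size here are illustrative:

```python
import csv
import tempfile

def write_chunks(rows, table_name, out_dir, chunk_size=5000):
    """Write rows out as numbered files: table_000001.asc, table_000002.asc..."""
    paths, chunk, count = [], [], 0

    def flush():
        nonlocal chunk, count
        if not chunk:
            return
        count += 1
        path = "%s/%s_%06d.asc" % (out_dir, table_name, count)
        with open(path, "w", newline="") as f:
            csv.writer(f).writerows(chunk)
        paths.append(path)
        chunk = []

    for row in rows:
        chunk.append(row)
        if len(chunk) >= chunk_size:
            flush()
    flush()  # write out the final, possibly short, chunk
    return paths

# Demo with a small in-memory data set: 12,000 rows -> 5000 + 5000 + 2000
out_dir = tempfile.mkdtemp()
paths = write_chunks(((i, "row%d" % i) for i in range(12_000)), "demo", out_dir)
print(len(paths))  # 3
```

Because each file is written and closed before the next begins, the exporter's memory use stays flat no matter how many rows flow through it.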


Conclusion: Chunks Restore Linear Performance



Database programmers depend heavily on "ACID" features to provide
robust data storage. We depend upon these features so much that we will
not consider using systems that cannot provide them (MySQL's MyISAM engine
for instance). The cost of these features for performance is considered
part of the bargain when robustness is required, but when
you are doing a huge insert, the ACID features cause performance to
go "non-linear", to become unpredictably long. As a first step you can
drop indexes, keys, and constraints on a table to improve load times,
but if that is not enough, you can restore linear performance by breaking
the large operation into many "chunks", each of which is small enough to
stay linear.



Next Essay: Performance in the Web Layer (http://database-programmer.blogspot.com/2008/06/database-performance-web-layer.html)
