
Author Topic: Brain haz can I? Need help inserting massive amount of data into a table.  (Read 2219 times)


Offline whabangTopic starter

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 7270
Hi,
This isn't related to the Amiga at all, but since many of you are quite competent in these matters, I thought this might be a good place to ask:

I'm currently designing a time reporting system for a small company. The idea is to let the drivers report their working hours over the web, instead of snail-mailing their time reports to the main office located at the other end of the country.

Now, this should be simple enough; however, they've requested that the web site look as similar to their paper time report cards as possible, and this means that I need to post up to 465 separate values from a single web page at once (15 columns and 31 days per month).

Here's the dilemma: I need a way to edit all this information through a web interface and then save the changes back into the database. It's rather easy to do a row-by-row solution, saving and loading the user ID and date separately, and then the rest of the information together in a single cell using PHP's serialization functions.
The real problems arise when you try to do this to 31 rows at once. How do I attach the date stamps and the rows to each other?

There's probably an easy solution to this, but my brain cells decided to go on strike, and no amount of coffee seems to be able to lure them back to work.

Any suggestions?
Beating the dead horse since 2002.
 

Offline ZeBeeDee

  • Hero Member
  • *****
  • Join Date: Jan 2007
  • Posts: 1081
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #1 on: September 23, 2008, 09:28:24 PM »
Don't use 31 days ... split it up into four weekly segments. They enter their data every day, not at the end of the month.

Much easier data handling  :-)
To err is human ... to BOING divine!

 

Offline whabangTopic starter

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 7270
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #2 on: September 23, 2008, 09:38:13 PM »
Agreed, but since they specifically asked for a system that allows them to edit all days at once, that unfortunately doesn't do me much good.
Beating the dead horse since 2002.
 

Offline ZeBeeDee

  • Hero Member
  • *****
  • Join Date: Jan 2007
  • Posts: 1081
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #3 on: September 23, 2008, 09:43:29 PM »
At the end of the month, just add up all the data.
To err is human ... to BOING divine!

 

Offline whabangTopic starter

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 7270
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #4 on: September 23, 2008, 09:49:38 PM »
It's either me not understanding what you're saying, or the other way around. :-D

Basically, I need to write a page that reads 465 values from a database, displays them all in form fields to allow editing, and then saves everything back when the save button is pressed.

The problem is that I have no clue about how to match the form fields with their corresponding cells once I save it all back to the database.
Beating the dead horse since 2002.
 

Offline ZeBeeDee

  • Hero Member
  • *****
  • Join Date: Jan 2007
  • Posts: 1081
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #5 on: September 23, 2008, 10:06:08 PM »
Ahhhh ...
To err is human ... to BOING divine!

 

Offline amigadave

  • Lifetime Member
  • Hero Member
  • *****
  • Join Date: Jul 2004
  • Posts: 3836
    • http://www.EfficientByDesign.org
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #6 on: September 23, 2008, 10:09:53 PM »
It sounds to me like you need a simple web page with only login and password fields for each driver to fill in, which then redirects to a calendar page where the driver can enter his hours for each day worked. This calendar page would then send the information to your database each time the driver pressed the save button.

The calendar page would need a script tied to the save button which sends the information to the company's server and on to the database.

That way, the company could use a program like Quickbooks to receive the information and put it in a format the company could use for payroll and tax purposes. That is, if there is a Quickbooks version for your language and country's tax system?
How are you helping the Amiga community? :)
 

Offline cicero790

  • Sr. Member
  • ****
  • Join Date: Mar 2008
  • Posts: 322
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #7 on: September 23, 2008, 10:15:46 PM »
Perhaps you can look at the code in these Joomla extensions.
A thought.
http://extensions.joomla.org/component/option,com_mtree/task,listcats/cat_id,1828/Itemid,35/


EDIT: you have to paste the complete address above, all the way to 35/; the linking does not work.
A1200 030 40MHz: 2/32MB Indivision AGA MkII
A600 7 MHz: 2MB
AROS 600 MHz
PC 13600 MHz: quad core i7 2600K 3.4GHz: 16GB RAM: ATI HD6950 2GB   (Yes I know)

WINUAE AmiKit ClassicWB AmigaSYS UAE4Droid  

 

Offline cv643d

  • Hero Member
  • *****
  • Join Date: May 2003
  • Posts: 1197
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #8 on: September 23, 2008, 10:21:47 PM »
Use an array, load it up with the 465 values from the DB, and echo it out into form fields.
When the user hits "submit", catch the posted values and loop through the array, updating every value in the DB. OK, I admit it's kinda unoptimized to make 465 queries in a row (but good enough, hopefully :lol: )

But maybe you can set a flag if a field has been updated, and only update those fields which have had an update.
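For what it's worth, the loop-plus-dirty-flag idea above can be sketched like this (Python for illustration, since the logic is language-agnostic; the `hours` table, `col_N` column names, and month column are invented for the example, and real code should use prepared statements rather than splicing values into the SQL):

```python
def changed_cells(original, submitted):
    """Return only the (day, col, value) triples the user actually edited."""
    return [
        (day, col, submitted[day][col])
        for day in submitted
        for col in submitted[day]
        if submitted[day][col] != original.get(day, {}).get(col)
    ]

def build_updates(user_id, month, edits):
    """One UPDATE per edited cell -- unoptimised, but only for dirty fields.
    (Real code should use prepared statements, not string splicing.)"""
    return [
        "UPDATE hours SET col_%d = %s WHERE user_id = %s AND month = %s AND day = %s"
        % (col, value, user_id, month, day)
        for day, col, value in edits
    ]

original  = {1: {1: 8, 2: 0}, 2: {1: 8, 2: 0}}   # values loaded from the DB
submitted = {1: {1: 8, 2: 4}, 2: {1: 8, 2: 0}}   # user edited day 1, column 2

edits = changed_cells(original, submitted)
print(edits)                        # [(1, 2, 4)]
print(build_updates(42, 9, edits)[0])
```

With the dirty check, an unchanged page produces zero UPDATE statements instead of 465.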
Amiga articles
"New shell. It was finished a while back, but I still see bugs, haha" - SSolie
 

Offline motorollin

  • Hero Member
  • *****
  • Join Date: Nov 2005
  • Posts: 8669
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #9 on: September 23, 2008, 10:38:27 PM »
Edit -

Sorry, learn to read...

--
moto
Code: [Select]
10  IT'S THE FINAL COUNTDOWN
20  FOR C = 1 TO 2
30     DA-NA-NAAAA-NAAAA DA-NA-NA-NA-NAAAA
40     DA-NA-NAAAA-NAAAA DA-NA-NA-NA-NA-NA-NAAAAA
50  NEXT C
60  NA-NA-NAAAA
70  NA-NA NA-NA-NA-NA-NAAAA NAAA-NAAAAAAAAAAA
80  GOTO 10
 

Offline whabangTopic starter

  • Hero Member
  • *****
  • Join Date: Mar 2002
  • Posts: 7270
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #10 on: September 23, 2008, 10:50:44 PM »
Quote

cv643d wrote:
Use an array, load it up with the 465 values from the DB, and echo it out into form fields.
When the user hits "submit", catch the posted values and loop through the array, updating every value in the DB. OK, I admit it's kinda unoptimized to make 465 queries in a row (but good enough, hopefully :lol: )

But maybe you can set a flag if a field has been updated, and only update those fields which have had an update.


That might actually work! Gonna try that tomorrow.
If nothing else, it'll keep the database server busy; almost 900 queries for saving and reloading a single page!  :lol:

I realise that this is an incredibly inefficient way of doing it, but the customer is always right.  :-D

Thanks for the tips!
Beating the dead horse since 2002.
 

Offline weirdami

  • Hero Member
  • *****
  • Join Date: Jan 2003
  • Posts: 3776
    • Http://Bindingpolymer.com
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #11 on: September 24, 2008, 06:10:23 AM »
Quote
Need help inserting massive amount of data into a table.


If the table is made of wood, a pocketknife should work. Take a look inside any high school for examples.
----
Binding Polymer: Keeping you together since 1892.
 

Offline Fester

  • Hero Member
  • *****
  • Join Date: Mar 2006
  • Posts: 586
    • http://www.rdmsnippets.com
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #12 on: September 24, 2008, 07:33:48 AM »
900 queries per page sounds complicated.

Well here's a bunch of assumptions:

If I understand correctly, you're displaying one month of data on a single web page, and that can involve up to 465 cells of data? So what is it, 31 days by 15 columns, displayed as a sort of grid?

What kind of control are you using to display these 465 cells?

If you're using simple input text boxes, why not name each box by row and column number? That would give each cell a unique name.

If you're "posting" this data back to the database, is it possible to request each cell by name?

I'm assuming your grid will have a start date and a start time (hour?) too. With that, there's no need to associate each date with all its cells. You should be able to determine in your post handler that the input box named rowX-columnY with such-and-such a value corresponds to such-and-such a date and time slot, assuming every column increments by the same time interval and every row by the same date interval, or vice versa.
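The row/column naming scheme can be sketched like this (Python for illustration; the "rXcY" field-name format and the start date are assumptions made up for the example, not anything from the thread):

```python
from datetime import date, timedelta
import re

def cell_from_name(name, start):
    """Map a form-field name like 'r5c3' back to a (date, column) pair."""
    m = re.fullmatch(r"r(\d+)c(\d+)", name)
    if not m:
        raise ValueError("unexpected field name: %r" % name)
    row, col = int(m.group(1)), int(m.group(2))
    # Row 1 is the first day of the reporting period.
    return start + timedelta(days=row - 1), col

d, col = cell_from_name("r5c3", date(2008, 9, 1))
print(d, col)   # 2008-09-05 3
```

Because the field name alone identifies the cell, the form never has to carry a separate date stamp per row.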

Or something or other...
 

Offline Karlos

  • Sockologist
  • Global Moderator
  • Hero Member
  • *****
  • Join Date: Nov 2002
  • Posts: 16879
  • Country: gb
  • Thanked: 5 times
Re: Brain haz can I? Need help inserting massive amount of data into a table.
« Reply #13 on: September 24, 2008, 10:52:38 AM »
Don't the rows of your table have a primary key you can use for identification?

If you are inserting new rows, there are ways to optimize it into a smaller number of queries.

For example, most SQL implementations support a multiple insert:

INSERT INTO my_table (foo, bar) VALUES (5, "fish"), (33, "lobster"), (17, "football") ...

When you get your form data back you can construct a string akin to the one above and issue it as your query. You might want to limit it to submissions of no more than a few hundred inserts at once, depending on how much data there is.
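Constructing such a batched multi-row INSERT string might look roughly like this (Python for illustration; the table and column names follow the example above, the 200-row batch size is an arbitrary choice, and numeric values are used so the sketch can skip quoting; real code must escape or parameterise string values):

```python
def multi_insert(table, columns, rows, batch=200):
    """Yield one multi-row INSERT statement per batch of rows.

    Assumes numeric values; string values would need proper quoting/escaping.
    """
    for i in range(0, len(rows), batch):
        values = ", ".join(
            "(" + ", ".join(str(v) for v in row) + ")"
            for row in rows[i:i + batch]
        )
        yield "INSERT INTO %s (%s) VALUES %s" % (table, ", ".join(columns), values)

stmts = list(multi_insert("my_table", ["foo", "bar"], [(5, 1), (33, 2), (17, 3)]))
print(stmts[0])
# INSERT INTO my_table (foo, bar) VALUES (5, 1), (33, 2), (17, 3)
```

For the 465-cell form, this turns hundreds of single-row INSERTs into a handful of queries.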

If you are updating rows, that comes down to having to loop over each row from the form and update the corresponding one in the table. You need to use some unique column (or group of columns) to match against. If your table has a primary key, use that.

In a voice minutes system I wrote and optimized at work, I have to deal with selections of literally tens of thousands of rows from tables containing millions, and turn them into various kinds of reports.

One of the operations involves assigning a price to each call record for a given client, which is tantamount to updating every row in a temporary table that has been determined to be an external call (worked out during the generation of that table). So you are looking at updating about 1/3 to 1/2 of the entire dataset, which can be over 100K rows. This was using MySQL as the backend.

I found the fastest solution by far, in this case, was to SELECT all the required rows, process them in the code to assign the price and write the result row to a temporary CSV file that has the same column structure as the temporary table.
Next I delete all the rows from the temporary table I just processed. This is an especially fast step since the query cache already knows which rows were affected by the WHERE clause from the previous SELECT statement used to grab the data.
Finally, I use a LOAD DATA INFILE statement to pull the temporary data back into the table. Obviously you need the FILE permission for this to work.
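The SELECT / process / DELETE / LOAD DATA INFILE round trip described above might be sketched like this (Python for illustration; the `tmp_calls` table, its column layout, the file path, and the flat per-second pricing rule are all invented for the example):

```python
import csv
import io

def price_rows(rows, rate=0.05):
    """Assign a price to each (id, seconds) call record -- the in-code step."""
    return [(rid, secs, round(secs * rate, 2)) for rid, secs in rows]

# 1. SELECT the external-call rows (here: fake data standing in for the fetch).
fetched = [(1, 60), (2, 120)]

# 2. Process in code and write a CSV matching the temp table's column layout.
buf = io.StringIO()
csv.writer(buf).writerows(price_rows(fetched))

# 3. DELETE the rows just processed, then 4. bulk-load the CSV back in.
cleanup = "DELETE FROM tmp_calls WHERE id IN (1, 2)"
reload_stmt = ("LOAD DATA INFILE '/tmp/priced.csv' INTO TABLE tmp_calls "
               "FIELDS TERMINATED BY ','")
print(buf.getvalue().strip())
```

The bulk load replaces per-row UPDATEs with one streamed file import, which is where the speedup comes from.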

This was literally hundreds of times faster than updating each row one by one.
int p; // A