Question:
What type of application should I use for large XML data processing?
Jen
2007-05-15 09:37:03 UTC
I am using Flash to create a form application where people can select up to four query options (using comboboxes), hit a submit button, and retrieve data from an XML file based on their selections. The data they receive becomes a list of titles linked to corresponding PDFs located on a CD. All of this will eventually ship on a CD, which is why I'm using Flash rather than PHP or something web-based. For usability reasons I am pretty limited in my options, but I am having such a frustrating time with Flash because of the size of the XML file and other factors (mainly, I don't think ActionScript suits this kind of project).

Does anyone know of a better way or have any suggestions? VB? AJAX? C? At this point, I don't really care what language, but if someone has done anything similar and thinks they might have a better solution than Flash, please let me know.

Thanks,
Jen
Three answers:
jake cigar™ is retired
2007-05-15 10:16:30 UTC
AJAX falls apart on a CD: nothing is MIME-typed properly, so you can't get XML back as XML. AJAH (fetching plain HTML instead) still works.



C and VB programs need different builds for different operating systems: Mac, XP, Linux, Vista, DOS.



The size of the XML matters to all web-based apps; 4 MB is the largest file you can be sure will work in all browsers.



Whatever you do, make sure the search is fast, optimized for the data, and doesn't read through the full dataset for each search.
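That "don't scan the whole dataset per search" point can be sketched in Python (hypothetical field names, since the original XML schema isn't shown; the real app was Flash/ActionScript, but the idea is language-agnostic): build an in-memory index keyed by the four combobox options once at startup, so each search becomes a single lookup instead of a full pass over the data.

```python
from collections import defaultdict

# Hypothetical records: each entry maps the four query options to a PDF title.
records = [
    {"subject": "biology", "year": "2006", "region": "north", "type": "report",
     "title": "Wetland Survey", "pdf": "wetland.pdf"},
    {"subject": "biology", "year": "2007", "region": "south", "type": "memo",
     "title": "Field Notes", "pdf": "notes.pdf"},
]

# Build the index ONCE, at startup -- not on every search.
index = defaultdict(list)
for rec in records:
    key = (rec["subject"], rec["year"], rec["region"], rec["type"])
    index[key].append((rec["title"], rec["pdf"]))

def search(subject, year, region, doc_type):
    # Each query is now one dictionary lookup, not a scan of the full dataset.
    return index.get((subject, year, region, doc_type), [])

print(search("biology", "2006", "north", "report"))
```

The index costs memory up front but makes every subsequent search O(1), which matters when the same 4 MB file is queried repeatedly from a CD.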



Flash is cute stuff, but will all your users have it installed?



UPDATE:



You would have to break down that 4.3 MB file! Depending on what it looks like, shorter tag names, more attributes and fewer tags, or discarding XML in favor of tab-delimited data would work.
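The shrinking options above can be sketched in Python (with a hypothetical verbose schema, since the real one isn't shown): collapse child elements into attributes on short tags, or drop XML entirely for tab-delimited text.

```python
import xml.etree.ElementTree as ET

# Hypothetical verbose source format: one child element per field.
verbose = """<documents>
  <document>
    <title>Wetland Survey</title>
    <subject>biology</subject>
    <file>wetland.pdf</file>
  </document>
</documents>"""

root = ET.fromstring(verbose)

# Option 1: short tag names and attributes instead of child tags --
# far less markup per record.
compact = ET.Element("docs")
for doc in root:
    ET.SubElement(compact, "d",
                  t=doc.findtext("title"),
                  s=doc.findtext("subject"),
                  f=doc.findtext("file"))
print(ET.tostring(compact, encoding="unicode"))

# Option 2: discard XML for tab-delimited text -- smallest of all.
lines = ["\t".join(doc.findtext(tag) for tag in ("title", "subject", "file"))
         for doc in root]
print("\n".join(lines))
```

Either form parses faster and loads in a fraction of the original size; the tab-delimited version also sidesteps the MIME-type problem mentioned above, since it's read as plain text.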
?
2016-05-19 02:02:28 UTC
Data profiling is a process to assess current data conditions, or to monitor data quality over time. It begins with collecting measurements about your data, then looking at the results individually and in various combinations to see where anomalies exist. Data anomalies are the "needle in the haystack" for technology projects. Even the best systems have them, but they may not cause pain until a data migration or integration project comes along. Once the "needles" are identified, the extract, transform and load (ETL) process or tools can remove them.

Data profiling is attribute, redundancy and dependency analysis. Attribute analysis yields a set of metrics that can be interpreted to reveal inherent business rules, as well as anomalies embedded in source-system data. Redundancy analysis assists in determining the source of record, and reduces the occurrence of violated primary keys during integration. Dependency analysis identifies orphan records and validates a normalized model. Together, these analyses make it possible to interpret data meanings and implement a structured approach to resolving data-related migration and integration issues.

Data profiling has a parallel in common business practice. How does your company prove the integrity of its financial position to its owners? "Auditing" is sometimes viewed as a dirty word, but it assures the owners that their decisions are based on reliable financial information. Data profiling ensures that your business decisions are likewise based on reliable information.
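The three analyses described above can be sketched in Python on hypothetical tables (the answer names no dataset, so every name here is an assumption): attribute analysis as null/distinct counts, redundancy analysis as duplicate primary keys, dependency analysis as orphan foreign keys.

```python
from collections import Counter

# Hypothetical customer/order tables for illustration.
customers = [
    {"id": 1, "name": "Ada"},
    {"id": 2, "name": None},   # attribute anomaly: missing name
    {"id": 2, "name": "Bob"},  # redundancy anomaly: duplicate primary key
]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 9},  # dependency anomaly: orphan record
]

# Attribute analysis: per-column null count (a metric hinting at a rule like
# "name is required").
null_names = sum(1 for c in customers if c["name"] is None)

# Redundancy analysis: primary-key values that occur more than once.
dup_keys = [k for k, n in Counter(c["id"] for c in customers).items() if n > 1]

# Dependency analysis: orders whose customer_id has no matching customer.
known_ids = {c["id"] for c in customers}
orphans = [o for o in orders if o["customer_id"] not in known_ids]

print(null_names, dup_keys, orphans)
```

Each list of "needles" found this way is exactly what an ETL step would then be written to correct or exclude.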
Java
2007-05-15 09:46:14 UTC
http://www.w3schools.com


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.