In the years that I've been at my place of employment, I've noticed a distinct trend toward something that I consider an anti-pattern: maintaining internal data as big strings of XML. I've seen this done a number of different ways, though the two worst offenders were quite similar.
The first application, a web service, provides access to a potentially high volume of data in a SQL database. At startup, it pulls more or less all of that data out of the database and stores it in memory as XML. (Three times.) The owners of this application call it a cache. I call it slow, because every perf problem we've run into while working against it has been directly traceable to this thing. (It being a corporate environment, it should come as no surprise that the client gets blamed for the perf failures, not the service.) This application does use the XML DOM.
The second application reads an XML file that was generated as the result of an export from a third-party database. The goal is to import this data into a proprietary system (owned by us). The application that does this reads the entire XML file in and maintains at least two, sometimes as many as four, copies of it throughout the entire importing sequence. Note that the data can be manipulated and transformed, and configuration can occur, before the import takes place, so the importer owns this data in XML form for its entire lifetime. Unsurprisingly, the importer then explodes when given a moderately sized XML file. It uses the XML DOM for only one of its copies; the rest are all raw XML strings.
My understanding of common sense suggests that XML is not a good format for holding data in-memory; rather, data should be translated into XML when it's being output or transferred, and translated into internal data structures when it's being read in and imported. The thing is, I'm constantly running into production code that completely ignores the scalability issues and goes through a ton of extra effort to keep the data as XML. (The sheer volume of string parsing in these applications is frightening.)
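To make the boundary approach I mean concrete, here's a hypothetical sketch (Python for brevity, not code from either application; `Customer` and the element names are made up): parse the XML into plain objects once at the edge, do all in-memory work on the objects, and serialize back to XML only on the way out.

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class Customer:
    id: int
    name: str

def parse_customers(xml_text: str) -> list[Customer]:
    # Translate XML into internal structures once, at the boundary.
    root = ET.fromstring(xml_text)
    return [Customer(int(c.get("id")), c.get("name"))
            for c in root.findall("customer")]

def to_xml(customers: list[Customer]) -> str:
    # Translate back to XML only when the data leaves the process.
    root = ET.Element("customers")
    for c in customers:
        ET.SubElement(root, "customer", id=str(c.id), name=c.name)
    return ET.tostring(root, encoding="unicode")

doc = '<customers><customer id="1" name="Ada"/><customer id="2" name="Bob"/></customers>'
customers = parse_customers(doc)
# All in-memory work happens on the objects, not on XML strings.
assert customers[0].name == "Ada"
```

No string parsing survives past the boundary; the rest of the program never sees XML at all.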
Is this a common failure to apply the right tool for the job that other people run into also? Or is it just bad luck on my part? Or am I missing some blindingly obvious and good situations where it's right and OK to store high volumes of data in-memory as XML?
Xml should be used only for transferring data between applications. Xml is a hugely bloated format that reduces performance. The higher the volume of data we are talking about, the more important this becomes.
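One rough way to see that bloat (an illustrative Python sketch; the exact numbers vary by runtime, and this only compares the raw XML string against a shallow list, before any DOM overhead):

```python
import sys
import xml.etree.ElementTree as ET

# Represent 1,000 (id, name) records two ways and compare in-memory size.
records = [(i, f"user{i}") for i in range(1000)]

root = ET.Element("records")
for rid, name in records:
    ET.SubElement(root, "record", id=str(rid), name=name)
xml_text = ET.tostring(root, encoding="unicode")

# The XML string alone dwarfs the list object, even though getsizeof
# here counts only the list's element pointers, not the tuples.
print(len(xml_text))           # characters of raw XML
print(sys.getsizeof(records))  # shallow size of the list, in bytes
```

Every record pays again for its tag and attribute names, which is exactly the overhead a parsed internal structure avoids.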
For your first example, the database should handle almost all of the caching, so storing all the data in program memory is wrong. That applies whether it's stored in-memory as XML or otherwise. For the second, you should convert the XML into a useful representation as soon as possible, probably a database, and then work with it that way.
Only if it's a small amount of data would it be appropriate to do all the work in-memory as an XmlDocument. String parsing should be used very sparingly.
I would like to add that when you join any existing project, you are likely to find some design and implementation decisions that you disagree with. We all learn new things all the time, and we all make mistakes. Though I agree that this seems like a "duh" kind of problem, I'm sure the other developers were trying to optimize the code through the concept of a cache. The point is, sometimes it takes a gentle approach to convince people, especially developers, to change their ways.
This isn't a coding problem, but a people problem.
You need to find a way to convince these developers that the changes you are suggesting don't imply they are incompetent. I'd suggest agreeing with them that caching can be a great idea, but saying that you'd like to work with them on it to speed up the functions.
Create a quick demo of how your (way more logical) implementation works compared with the old way.
It's hard to argue with dramatic speed improvements.
Just be careful not to directly attack their implementation in conversation. You need these people to work with you. Good luck!
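A quick demo of the kind suggested above could be as small as this (a hypothetical Python sketch, not the original service; the "cache" shape is made up). It contrasts looking one record up in an in-memory XML cache against an ordinary dictionary:

```python
import timeit
import xml.etree.ElementTree as ET

# Build a 10,000-entry "cache" both ways: as a DOM and as a plain dict.
root = ET.Element("cache")
for i in range(10_000):
    ET.SubElement(root, "item", key=f"k{i}", value=str(i))
by_key = {item.get("key"): item.get("value") for item in root}

# Looking up one key: scan the DOM with XPath vs. hash into the dict.
dom_lookup = lambda: root.find("item[@key='k9999']").get("value")
dict_lookup = lambda: by_key["k9999"]

print(timeit.timeit(dom_lookup, number=100))
print(timeit.timeit(dict_lookup, number=100))  # typically orders of magnitude faster
```

Both return the same value; the timings make the argument for you without anyone's code being criticized directly.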
But grasping at straws, the only use I could see for data being stored as XML is for automated unit tests, where XML provides an easy way to mock up test data.
Definitely not worth it, though.
The COM object could take either XML or a class. The interop overhead of filling each member of the class was far too large, and processing XML was a much faster alternative. We could have made a C# class identical to the COM class, but it was really too difficult to do in our timeframe. So XML it was. Not that it would ever be a good design decision, but when dealing with interop for huge data structures, it was the fastest thing we could do. I do have to say that we are using LINQ to XML on the C# side, which makes it slightly easier to work with.
Converting the XML into objects will speed up your data access. Objects are in most cases easier to work with, and they give a better picture of your domain. I am not against using XML, but it is like patterns: they are tools that we should understand where and when to use, not fall in love with and try to use everywhere.
I never stored the XML as a string (or multiple strings).
I just parsed it into a DOM and worked with that.
THAT was helpful.
I've imported XML sources into the DOM (Microsoft Parser) and kept them there for all the required processing. I'm well aware of the memory overhead the DOM causes, but I found the approach quite useful nonetheless.
- Some checks during processing need random access to the data.
The selectPath statement works quite well for this purpose.
- DOM nodes can be handed back and forth in the application as arguments.
The alternative is writing classes wrapping every single type of object, and updating them as the XML schema evolves. It's a poor (VB6/VBA) man's approach to polymorphism.
- Applying an XSLT transformation to all or parts of the DOM is a snap.
- File I/O is taken care of by the DOM too (xmldoc.save...).
All the search and I/O functionality I would otherwise have had to code myself.

What I'd perceived as the anti-pattern is actually an older version of the application, where the XML was parsed more or less manually into arrays of structures.
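For comparison, the conveniences listed above (XPath-style random access, built-in file I/O) exist in most DOM libraries, not just MSXML. A short Python ElementTree sketch with a made-up schema:

```python
import os
import tempfile
import xml.etree.ElementTree as ET

root = ET.Element("inventory")
ET.SubElement(root, "part", sku="A-1", qty="4")
ET.SubElement(root, "part", sku="B-2", qty="7")
tree = ET.ElementTree(root)

# Random access during processing, with no hand-written search code.
print(tree.find("part[@sku='B-2']").get("qty"))  # 7

# File I/O is handled by the DOM as well (the xmldoc.save equivalent).
path = os.path.join(tempfile.mkdtemp(), "inventory.xml")
tree.write(path, encoding="unicode")
print(ET.parse(path).find("part[@sku='A-1']").get("qty"))  # 4
```

The trade-off stays the same as in the answer above: memory overhead in exchange for search, transformation, and persistence that come for free.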
As a 'Frozen Stream'. There is also a video of this, along with the other presentations given at XML Prague 2009.
The framework takes care of gathering the events and distributing them to the appropriate handlers. A third party can easily define its own additions to the format and provide appropriate generators and handlers. The important part here is that the framework has to forward the XML, with all the XML information intact, from the generator to a handler. In this case, implementing an internal data structure that captures all the necessary information amounts to re-implementing most of XML itself. Hence, using an appropriate DOM framework for internal data representation makes sense.
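The generator-to-handler forwarding described above can be sketched with SAX-style events (a minimal Python illustration; the real framework's API is not shown here, and the tag names are invented). A distributor routes each top-level element's events, intact, to whichever handler registered for that tag:

```python
import xml.sax

class Distributor(xml.sax.ContentHandler):
    # Routes the events of each top-level record to a registered handler,
    # forwarding the XML information (tags, attributes) unchanged.
    def __init__(self, handlers):
        self.handlers = handlers
        self.current = None
        self.depth = 0

    def startElement(self, name, attrs):
        self.depth += 1
        if self.depth == 2:  # a new top-level record begins
            self.current = self.handlers.get(name)
        if self.current:
            self.current.startElement(name, attrs)

    def endElement(self, name):
        if self.current:
            self.current.endElement(name)
        if self.depth == 2:  # the record is finished
            self.current = None
        self.depth -= 1

class CountingHandler(xml.sax.ContentHandler):
    def __init__(self):
        self.count = 0

    def endElement(self, name):
        if name == "order":
            self.count += 1

orders = CountingHandler()
xml.sax.parseString(b'<doc><order id="1"/><order id="2"/><invoice/></doc>',
                    Distributor({"order": orders}))
print(orders.count)  # 2
```

Because the handlers receive raw XML events rather than a bespoke internal structure, third-party additions (like the unhandled `invoice` element here) pass through without the framework needing to understand them.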