Month: May 2008
What is the definition of “user friendly” for a doctor?
Ok, this is really an interesting one. In medical informatics, one challenge that never seems to be conquered is providing a user interface that will not make a doctor grumble.
No matter how hard you try, you almost always hear the comment: “this is not so easy to use…”. Medical professionals seem to be very picky when it comes to user interfaces and interaction with information systems. This link here mentions the same thing again. I’ve previously written about Microsoft CUI, and for all of our sakes, it had better be successful. This is a field that is sucking up a huge amount of effort, and it is a large setback to the adoption of many systems. Being terrible at user interfaces myself, I am not the one to take things further in this domain, but this is one field which should benefit enormously from some form of standardization.
Can you build a product for data mining?
I am not sure that the answer to this question is yes. Sure, you can find a lot of vendors who’d claim that they already have a product that can miraculously convert raw data into precious information, but is this really the case? I doubt it.
My own experience with real-life data has shown me that generating information from data is a complex process. It almost always requires understanding of the domain in which the data is generated, and moreover, the issues with the data make the whole thing an ad-hoc process. I’ve asked the Weka mailing list about commercial tools that can be considered as alternatives to Weka, and the responses so far confirm my feelings about the domain: you can’t simply throw in a product and expect people to get knowledge out of data. You need a substantial amount of knowledge about machine learning and data mining, and about how these tools and, more importantly, their algorithms work. Most of the time, you’ll need to adjust many parameters, transform data, and iterate continuously until you have a pipeline that connects raw data to some form of decision-making process. Sure, there are tons of ETL tools, business intelligence tools, etc. These are either infrastructure tools for transforming and/or moving data, or glorified reporting tools.
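To make the “pipeline” point concrete, here is a minimal sketch of one such iteration using the Weka Java API. The file name and the choice of filter and classifier are placeholders; in a real project they are exactly the things you would revisit over and over:

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import weka.filters.Filter;
import weka.filters.unsupervised.attribute.Normalize;

public class MiningPipeline {
    public static void main(String[] args) throws Exception {
        // Load the raw data (the ARFF file name is a placeholder).
        Instances data = DataSource.read("raw-data.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Transform: normalize numeric attributes before learning.
        Normalize normalize = new Normalize();
        normalize.setInputFormat(data);
        Instances transformed = Filter.useFilter(data, normalize);

        // Model and evaluate: 10-fold cross-validation of a decision tree.
        J48 tree = new J48();
        Evaluation eval = new Evaluation(transformed);
        eval.crossValidateModel(tree, transformed, 10, new Random(1));
        System.out.println(eval.toSummaryString());

        // In practice you now go back: try other filters, other learners,
        // other parameters, and re-run until the output is actually usable
        // in the decision-making process.
    }
}
```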
The machine learning domain is a moving target with new methods introduced continuously, and I cannot think of a product that can integrate these methods in a way that someone with no knowledge of machine learning or data mining can use to create meaningful information.
I’ve always considered data mining and the related decision support domain a service-based industry. Weka seems to be the most suitable tool for this domain, but I have my doubts about scalability problems when using SVMs, or when learning Bayesian networks from data. I’d like to see how GPU-based solutions like CUDA from Nvidia would perform in these cases, and more importantly, whether it would be possible to parallelize these algorithms for multicore CPUs or GPUs (maybe even for the Cell processor, the horsepower of the PS3).
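I don’t have a GPU port of these algorithms to show, but even before going that far there is a coarse-grained route on a multicore CPU: independent pieces of work, such as cross-validation folds, can run on separate cores. Below is a minimal sketch with Weka’s SMO (its SVM implementation) and a thread pool; the file name is a placeholder, and note that this parallelizes the folds, not the SVM training itself:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import weka.classifiers.Classifier;
import weka.classifiers.Evaluation;
import weka.classifiers.functions.SMO;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ParallelFolds {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("raw-data.arff"); // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        final Instances randomized = new Instances(data);
        randomized.randomize(new Random(1));

        final int folds = 10;
        ExecutorService pool = Executors.newFixedThreadPool(
                Runtime.getRuntime().availableProcessors());
        List<Future<Double>> results = new ArrayList<Future<Double>>();

        for (int i = 0; i < folds; i++) {
            final int fold = i;
            results.add(pool.submit(new Callable<Double>() {
                public Double call() throws Exception {
                    // Each fold trains its own classifier, so threads share nothing.
                    Instances train = randomized.trainCV(folds, fold);
                    Instances test = randomized.testCV(folds, fold);
                    Classifier svm = new SMO();
                    svm.buildClassifier(train);
                    Evaluation eval = new Evaluation(train);
                    eval.evaluateModel(svm, test);
                    return eval.pctCorrect();
                }
            }));
        }

        double sum = 0;
        for (Future<Double> result : results) {
            sum += result.get();
        }
        pool.shutdown();
        System.out.println("Mean accuracy over " + folds + " folds: " + (sum / folds) + "%");
    }
}
```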
If you can’t scale, you do not have a chance.
This is about Ruby on Rails. It is not meant to start a flame war, and it is not written to bash it. It is just that some of my concerns are apparently not only mine. When I wrote about the Rails framework before, I was fond of the productivity it brought. Especially with the support for Ruby in NetBeans, ROR becomes a nice framework. However, scalability, stability, and enterprise features in general are required in our age. You can go ahead and read about the scalability features of ROR all around the web; I did so.
What you have to be ready for is to provide solutions quickly. You have to have an idea about how to scale from 200 requests a day to 200k in a week. With Java and .NET, I have that. Especially with Java: I can be running on Tomcat, and I can switch to JBoss to have clustering, load balancing, whatever I need. I can use NLB with Windows servers, switch to COM+, etc.
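A big part of why that switch is cheap is that the servlet programming model does not change between a single Tomcat and a JBoss cluster. As a minimal sketch (the servlet and its Cart class are made up for illustration): as long as everything you put into the HttpSession is Serializable, the same code can run behind a load balancer with session replication; the move is configuration (marking the webapp distributable, setting up the cluster), not a rewrite.

```java
import java.io.IOException;
import java.io.Serializable;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class CartServlet extends HttpServlet {

    // Hypothetical session state; Serializable so it can be replicated
    // across cluster nodes.
    static class Cart implements Serializable {
        private int items;
        void add() { items++; }
        int size() { return items; }
    }

    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        HttpSession session = request.getSession(true);
        Cart cart = (Cart) session.getAttribute("cart");
        if (cart == null) {
            cart = new Cart();
        }
        cart.add();
        // Re-setting the attribute tells the container the session changed,
        // so replication picks it up.
        session.setAttribute("cart", cart);
        response.getWriter().println("Items in cart: " + cart.size());
    }
}
```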
The point is that these are very well-tested solutions to scalability problems. They have been proven for a lot of cases. The news is that ROR is having a hard time these days: a very well-known site, Twitter, might be abandoning ROR. When you search for scaling Ruby on Rails, you’ll probably come across something that contains references to Twitter.
I’ve switched from ROR to J2EE for my project, and I’m happy I’ve done so. Yes, my backend code is not as efficient to write as Ruby, but I’ve been successful at responding to requests quite quickly, and most important of all, while I am facing the possibility of scaling to a much larger level, I have no question marks about how to do that. I guess feeling confident that I have well-proven methods to scale sometimes matters more. I guess JRuby becomes very important in this context, since it can run ROR and it can scale using the J2EE infrastructure. Let’s see what will happen on this front.