We share the concerns discussed above, especially regarding the idea of going cloud-based. We are in a rural area and have routine outages in communication between our on-site server and a remote clinic, despite paying a lot for dedicated T-1 lines. There is no way we would consider going to "the cloud" for our EMR.
One thing we are also concerned about is that we are a specialty clinic (ophthalmology/optometry), and we have only been able to use CPS after hundreds of hours of form customization. So far we have seen nothing regarding Project Northstar that addresses the need for custom data-input forms or custom workflows for specialties such as ours.
Frankly, I would just like GE to add some basic 1990s MS Windows capabilities (e.g. being able to expand a window in the letter editor to match the width of a piece of paper, the ability to tile one form next to another so our providers can see what was written in another form while they make decisions for their assessment & plan, the ability to correct the spelling of a word while using the spell checker, etc.). These are some of the things that make day-to-day CPS EMR usage very difficult and frustrating.
All of the Project Northstar discussion and goals are well and good, but I'm not so confident that they will be able to deliver in a timely manner, given the fact that they haven't even been able to keep up with Internet Explorer releases or given us the ability to upgrade our client machines to Windows 10...
I must admit I do not have much information on Project Northstar beyond what I have read thus far in this thread.
Project Northstar is a clear indication that GE is committed to evolving their product. The initiative will allow GE to drop legacy code that has been around for a decade or more, adopt modern coding practices, and reap the benefits of the cloud.
In saying this, I am not siding with GE; the point I am trying to illustrate is that this is a necessary evil. For GE, this is nothing but a business decision (my opinion), made in order to stay competitive in the market with the likes of AthenaHealth, NextGen, EPIC, and all the other next-generation EHRs.
As for Project Northstar itself, it is still in its infancy, and the modular approach may not work for much of the customer base, for the same reasons previously mentioned. In the past two or three years, GE has asked customers to add more and more and to spend budget dollars that we do not have. As we all know all too well, GE likes to have customers oversubscribe on all fronts due to the inadequacy of their software. Those of you who know me know that I have been telling anyone who would listen that the code within the Client itself needs to be optimized; if they did that, the performance problems and crashing would stop.
As for Interoperability, which (as one of the CHUGgers stated) is the other focus of Project Northstar: the problem I see is that not all community healthcare organizations (primary care, specialists, and/or hospitals) have the means to communicate via the established CCD/CCDA standards, or the soon-to-be new standard, FHIR.
It will be interesting to learn more about this project in Austin.
I agree. Northstar is a necessary step to leave behind the old, obsolete, and very poorly-written codebase and start fresh. At the same time it allows GE to integrate current technology such as HTML5, browser-based access, and a lot of other modern things.
My concern, and I'm sure others share it, is that GE will build and maintain Northstar the way they do CPM/CPS/C-EMR. GE has had many years to fix a lot of issues, but it never seems to happen.
On the flip side, it takes a lot more time and resources to update and modify bad code than good code. Northstar would (hopefully) start with a good codebase that is much more modern and manageable for everyone involved in the software development.
But then GE reminds us that epic fails go beyond the code. Although perhaps not quite measuring up to "epic" on the fail scale, disabling ICD10 codes obsoleted on 10/1/16 regardless of the date of service is just the most recent example. That's a fundamental failure at several levels--a failure that SOMEONE at GE had to know, or should have known, was going to happen. I find it incredibly unlikely that every team (database, code, management) at GE failed to predict that the validity of ICD10 codes would need to be based on date of service.
I haven't looked, but my guess is the database doesn't have "valid from" and "valid to" date columns for ICD10 codes; otherwise, validating codes based on date of service (or fixing this issue) would have been very easy. Even if those columns were added in a recent update, the code may be so hard to work with that the fix couldn't be delivered in time.
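For what it's worth, the fix being speculated about is straightforward if the validity windows exist. Here is a minimal sketch of date-of-service validation; the column names (valid_from/valid_to) and the code entries are purely illustrative assumptions, not GE's actual schema or real ICD-10 validity data:

```python
from datetime import date

# Hypothetical ICD-10 lookup table with validity windows.
# Entries and dates are made up for illustration only.
ICD10_CODES = {
    # code: (valid_from, valid_to) -- valid_to=None means still active
    "A00.0": (date(2015, 10, 1), None),
    "B99.8": (date(2015, 10, 1), date(2016, 9, 30)),  # obsoleted 10/1/16
}

def is_code_valid(code: str, date_of_service: date) -> bool:
    """A code is valid if the DATE OF SERVICE falls inside its window,
    regardless of today's date -- the behavior the poster expected."""
    window = ICD10_CODES.get(code)
    if window is None:
        return False
    valid_from, valid_to = window
    if date_of_service < valid_from:
        return False
    return valid_to is None or date_of_service <= valid_to

# An obsoleted code remains usable for older dates of service:
assert is_code_valid("B99.8", date(2016, 9, 15))       # DOS before cutover
assert not is_code_valid("B99.8", date(2016, 10, 5))   # DOS after cutover
```

The point of the sketch is that the check keys off the encounter's date of service rather than the current date, which is presumably what disabling codes "regardless of the date of service" failed to do.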
If that's how the ICD10 code update played out, management probably went with the "it's easier to ask forgiveness than permission" strategy and buried the problem. Then, when the issue was reported, they announced the "newly discovered" issue.
I could be way off. However, GE's past behavior and lack of transparency with everything encourage such perceptions.
Northstar can't fix systemic issues like that, and I won't be at all surprised if it is not fixed on 10/1/17 either.
/rant
Amar, I know you went to the cloud a while back. Have you learned anything about how the GE implementation will affect you? Are you going to bounce from your cloud provider to the GE cloud or would GE have a presence in your current cloud? I ask this because as I prepare for the eventual transition to the GE cloud I would like to do it with SLAs in place or a private connection over fiber to the Cloud provider. There is one cloud provider here in Tallahassee but it makes no sense for me to move my infrastructure to the cloud then to access the GE cloud from there. I would prefer to have the "good" connection to the cloud instead of having to traverse multiple clouds with more points of failure.
Neither GE nor any other cloud vendor really cares about how you get to their cloud, only that you sign on with them for a fee. In this case, we are being told that we must go to the cloud, but the added expense of an SLA is not going to be offset by a discounted GE service, and I will still need my infrastructure to support my other applications. I still have no idea if the GE cloud includes the Kryptiq DM, Portal, ESM, Biscom fax, etc. I expect we are only talking about the JBoss/SQL part, which means we will still need a hefty infrastructure locally. Our doctors don't want downtime, but we want to get a handle on costs. It looks like our costs will increase.
The SLA part is important to me, especially given what I learned during and after Hurricane Hermine, which was only a weak Category One hurricane. Some of us were without power for a week. Cellular towers and some ISPs like Comcast have only 24 hours of reserve power for the routers/switches in their infrastructure. That meant no cellular or Internet service 24 hours after the storm knocked down the lines that were not underground. My ISP is fantastic and stayed up, but it could have been affected by the local utilities in a stronger storm.
My advice to anyone reliant on their internet provider in an area that gets hurricanes or tornadoes: stay away from Comcast or any provider that runs its cabling on utility poles. Our local ILEC was down for over a week in some areas, and the CLECs dependent on them were obviously down as well. Choose your connection to the cloud wisely.
Mike Zavolas
Tallahassee Neurological Clinic
Mike, there are still varying degrees of unknowns when it comes to Project Northstar (even after attending the CHUG sessions). From what I learned, GE is approaching this piecemeal: the first iteration of the Cloud feature will be the Orders Summary. That means the Orders Summary component will run in GE's Cloud while the rest of the application still resides where it is currently. We are expected to see this new feature in late 2017.
As you noted, I am not sure what will happen to the other components such as SMPP, eSM, DM, and the rest, or whether they will be migrated; I would assume that at some point it will all be in the GE cloud. My expectation is that this is a 3-5 year journey for all of us.
The other thing about Project Northstar is that there will be two datasets for quite some time, until the application is fully migrated to the Cloud. This is a scary thought, especially if the data gets out of sync for whatever reason, and that is just one aspect.
Mike, I hope that I shed some light.... I can certainly say this will be a hot topic for quite some time.