
Idoc to File with acknowledgement scenario


Introduction:
This article will cover the IDoc to file scenario with an acknowledgement back to the sender ECC system, using the Sender IDoc Adapter (AAE).


Requirement: In this scenario, when an IDoc is triggered from SAP and posted to the target SFTP/FTP server, an acknowledgment of the message's successful arrival needs to be sent back to SAP without designing another asynchronous scenario. This can be achieved using the ALEAUD IDoc, which updates the status of the original IDoc. The objective is to provide the configuration needed to receive the acknowledgment back in ECC.


Basically, the sender IDoc Adapter (AAE) supports the following acknowledgements:

  • System acknowledgment: sent back when the request arrives at the final receiver.
  • System error acknowledgment: sent back when a system error occurs during message processing within SAP XI.
  • Application acknowledgment: sent back when the message is successfully processed within the receiver application.
  • Application error acknowledgment: sent back when an error occurs during message processing within the receiver application.

ALEAUD Special Handling Needed

 

Choose this option to enable the sender Java IDoc adapter to apply special handling to the incoming ALEAUD
message. The adapter replaces the IDoc number of the ALEAUD message with the
original IDoc number.
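
For orientation, the inbound acknowledgment looks roughly like the fragment below. This is a heavily simplified sketch: the segment and field names follow the standard ALEAUD01 IDoc type, but the values and the exact set of populated segments are illustrative assumptions, not taken from this scenario.

<!-- Illustrative ALEAUD01 fragment; all values are invented for this example -->
<ALEAUD01>
  <IDOC BEGIN="1">
    <EDI_DC40 SEGMENT="1">
      <MESTYP>ALEAUD</MESTYP>               <!-- message type of the acknowledgment itself -->
    </EDI_DC40>
    <E1ADHDR SEGMENT="1">
      <MESTYP>ORDERS</MESTYP>               <!-- message type of the acknowledged IDoc -->
      <E1STATE SEGMENT="1">
        <DOCNUM>0000000000123456</DOCNUM>   <!-- IDoc number; the adapter swaps in the original one -->
        <STATUS>41</STATUS>                 <!-- status to be set on the original IDoc -->
      </E1STATE>
    </E1ADHDR>
  </IDOC>
</ALEAUD01>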

 

If the target system is a non-ECC system, we may get only system acknowledgments. If the target system is an ECC system, we can get system acknowledgments as well as application acknowledgments, and we can select all five options above, although the target ECC system will have to run the program ‘RBDSTAT’ to send the application acknowledgments.

 

Steps:

  1. We have the IDoc to file scenario working as expected: the IDoc is triggered and the resulting XML is placed in the SFTP server.
  2. To receive the acknowledgment back into SAP, the ALEAUD IDoc needs to be configured as an inbound parameter in WE20, so that the acknowledgement arrives as an ALEAUD IDoc, which in turn updates the original IDoc status.

 


In PI:

3. Configuration of the IDoc sender channel: in the IDoc sender channel, we need to provide the acknowledgment destination, an RFC destination pointing from PI to ECC.

  4. In the tab ‘Ack Settings’, we need to select the acknowledgment parameters.

 

5. Persistence parameter in NWA: the persistence parameter must be set to ‘true’ in NWA in order to receive IDoc acknowledgments. The path of this parameter is NWA -> Configuration -> Infrastructure -> Application Resource -> Java IDoc Adapter -> Properties.



Note for Persistence:

 

If you want the IDoc information to be persisted in the DB, set the value to true; otherwise set it to false.

  • If the value is set to false, IDocs and their acknowledgments cannot be monitored.
  • When the value is set to false there is no database operation, and hence the IDocs are processed faster.
  • Message persistence itself is only done at the Messaging System of the Adapter Engine. The persistence here refers to correlation information, which is required for monitoring and acknowledgment handling. Hence setting the parameter to true is necessary to get acknowledgements.

 

6. Test by triggering the IDoc from SAP and receiving the updated IDoc status via the ALEAUD IDoc.

 

In SAP, we find an inbound ALEAUD IDoc triggered to update the original ORDERS IDoc status from 03 to 41.

 

In PI, there would be one message for the original outbound message and one for the inbound ALEAUD acknowledgment message.


Mobile Social Commerce: Mobile-First, Social Interaction and a Marketplace


Online buying and selling is almost as old as the Internet itself. eCommerce companies such as Amazon and eBay are true pioneers and their successes are undisputed. Many other sites dedicated to buying and selling have followed, and new ones, focused on specific product categories and brands, launch every day. A significant number of them have associated mobile apps, although most were specifically designed for the desktop, with mobile as an afterthought. Increasingly, eCommerce sites now utilize Responsive Web Design (RWD), which provides an optimal user interface experience regardless of the display type, from mobile to desktop. In virtually all cases, mobile was an add-on; in other words, the site, product, or brand first builds its online presence and the mobile views are added later.

 

Mobile-first is a relatively new concept whereby the seller focuses first and foremost on the mobile experience as the primary user-interface for the eCommerce endeavor.   Mobile-first sites typically target mobile devices in their initial launches with some also utilizing responsive web design and some launching with a mobile app.

 

As of Q3 2013, social networking giant Facebook accounted for 874 million mobile Monthly Active Users (MAU) with 507 million mobile Daily Active Users (DAU). These numbers do not include the 150 million users of Instagram, the popular mobile-first, photo-based social network. Over the past 15 years, online social networks have sprung up everywhere, and it is a concept that works. Whether it is an online forum dedicated to a single hobby or an enormous general community such as Facebook, almost everyone with some type of online presence has participated or is participating in an online social network. Some have done a good job of monetizing this social structure, while others have not.

 

But what if we take the concept of mobile-first, an online marketplace, and social network qualities and combine them into a Mobile Social Commerce site? (Maybe at this point we coin the name: MoSoCo.) Then we have a mobile commerce site, focused primarily on mobile devices, with a heavy social aspect. At this point, you’d say, “well, on Amazon, you can leave feedback.” True. You can on Etsy as well; however, neither Amazon nor Etsy, nor eBay for that matter, was mobile-first. “Mobile” was added well after these were established online marketplaces. Furthermore, a feedback mechanism is not really social commentary in the truest sense. Does such a company exist, and is it successful?


 

I think so, and it should serve as a blueprint for extending social network concepts into mobile-first marketplaces. A Menlo Park company called Poshmark has achieved the best of Mobile Social Commerce. Poshmark’s tag-line is “Poshmark is a fun and simple way to buy and sell fashion. Shop the closets of women across America – and sell yours too!” From their initial launch in December 2011, they have always been primarily focused on the mobile user, initially launching on iPhones and later supporting iPads. They added Android support in October 2013. Poshmark does support RWD, so desktop users and users of other mobile OSes are still welcome.

 

According to the Poshmark Closet Sharing Economy Report, their users have bought and sold over 1.5 million items from each other. It is interesting to note that Poshmark really promotes their mobile-first heritage. In the report, they note: ”Mobile means the Closet Light is on Friday Nights: Smartphones make shopping accessible 24/7, thus leading to an earlier weekly purchase cycle than traditional brick and mortar retailers which see their highest activity on Saturdays.” While Poshmark doesn’t release its user counts, I estimate that there are around 2.2 million Poshmark users. Other sources note that of those 2 or so million users, around 250,000 are active sellers, meaning the rest are likely buyers only. The women’s fashion industry is expected to exceed $621 billion in 2014, based on statistics from the research firm MarketLine. Clothing retailers are expected to contribute 65% to this global total.

Marketplaces such as Poshmark focus on clothing resale, and as they note: “Poshmark is more than just another shopping destination. We are focused on offering a one-of-a-kind unique experience in connecting people and their closets.” In Poshmark-speak, a “closet” is a seller’s online shop, but it is also literally clothes from a seller’s closet: something that she wants to resell, or perhaps something new that she bought but never wore (perhaps even “new with tags”). In reality, many Poshmark sellers offer not only their own items but also items from friends or relatives. Variations of closets include brick-and-mortar retail boutiques and import wholesalers. But the key for Poshmark is that it is solely focused on women’s fashions and accessories. Poshmark sellers have uploaded over $350 million of inventory and upload over a “Nordstrom’s store worth of inventory every 2 weeks.”

 

The social aspect of Poshmark is what is so amazing.  Here’s an interesting statistic:  One closet that I looked at had around 300+ items for sale.  Across all of these 300+ items were over 22,000 comments (an unscientific measure is around 73 comments per listing)!  Some items had close to 1000 comments, while others had only a few.  Digging deeper, you’ll find that these are detailed discussions among dozens of people.  Some comments are detailed discussions about the item; other threads covered every subject imaginable.  Additionally, some listings were not items for sale, but designated “chat rooms,” or listings sharing significant life events: a vacation, a new baby, a new kitten, or condolences.  Some were very funny; others were very touching.  The bottom line is that Poshmark, as a mobile-first marketplace, is also a very significant social destination.  The women of Poshmark have created an amazing community that transcends anything that has existed in a marketplace of any kind before – mobile or non-mobile.

 


Poshmark closets also display how many followers they have as well as how many they follow. There is also a concept of closet sharing. Sellers share items in their closet to their followers, but they can also share items in other sellers’ closets to their followers. Everyone has a “feed” in which they can see these items. This further adds to the social aspect of Poshmark and allows each seller who works at it to get her items in front of more potential buyers. Interestingly, many “Poshers” (as Poshmark users are called) spend hundreds of hours per month on Poshmark, listing items, sharing closets, and engaging in social conversations. Many close friendships have been and are being forged through Poshmark alone. Because it is mobile-first, all activities can easily be done through the Posher’s mobile device via the Poshmark app.

 

I’m pleased that Poshmark also enables Poshers to exchange information about an item via SMS, right from the app. They also enable direct uploads to other social media such as Twitter, Tumblr, Facebook, Instagram, and Pinterest, among others. This further enables the Posher to promote their closet or items to people outside of the Poshmark community. In fact, many Poshers that become friends extend that friendship to include frequent texting (also an unscientific poll, but virtually all Poshers use standards-based SMS vs. a non-SMS-interworking OTT).

 

While the uniqueness of Poshmark makes it the clear leader in mobile-first, social commerce, there are a few others trying to catch up. A very similar marketplace called Threadflip is chasing Poshmark. In their initial incarnation, they were more focused on desktop, with mobile a mere afterthought. However, over the last 6 months or so, they’ve been moving more and more functionality to their mobile version, and they only recently added a social element (very much like Poshmark). However, with Threadflip there is no closet-sharing concept, which can limit the social interaction as well as a seller’s ability to enhance the visibility of her items. In Poshmark, sometimes the act of “closet sharing” serves as an introduction between two Poshers. While Threadflip and Poshmark have some similarities and, in a few areas, advantages over each other, overall it seems that as long as Poshmark continues to evolve its experience (and they do still have some issues that their main competitor Threadflip tries to exploit, such as non-fashion items being listed, offline transactions, and boilerplate customer support), they can remain the leading Mobile Social Commerce marketplace for the foreseeable future. This formula works and should be a blueprint for more to come. Today, I believe mobile-first should rank as one of the top attributes of any online marketplace; however, enabling social interaction must also be considered, as it can add a personal element to what has been (until now) a mostly impersonal buying experience.

 

Now if someone would just launch a MoSoCo marketplace for auto accessories, parts, and tools, where guys can share our garages! GearheadMark, anybody?

 

Please follow me on Twitter: @wdudley2009

My First Tech Love... ♥ Online Dating ♥


Most of my friends and work acquaintances consider me to be a “jock”. I, however, prefer to think of myself more as a "sportsman", "outdoorsman", or "endurance athlete". Sure, it is true that I can hop out of bed (even with a hangover) and run 100 miles through the mountains without breaking a sweat. But I didn't always have these abs of steel and bulging muscles. In fact, in school I was the quintessential geek. Back in high school, while the cool kids drove to the mall for our hour-long lunch break, I skipped lunch and wrote computer programs in Basic on the school’s computer (I think it was from Radio Shack). A few years later, in college, those same popular kids spent their evenings doing vodka shots and passing out on the front lawn at frat parties at 2:00 am, while I was holed up in the University computer lab typing commands in VAX/VMS.

 

This was pre-Windows, pre-Internet (at least as we know it now) and pre-Facebook. Yet I uncovered something that would forever change my life and introduce me to the strange and exciting new world of online dating. Somehow, during my late-night sessions of computer geeking, I stumbled upon the fact that it was possible to ping the server for a list of other logged-on users and to communicate with these other users via a rudimentary and cryptic messaging protocol. That’s how I met my first girlfriend. Let’s call her Missy – well, because that was her name, actually.

 

We hit it off online, eventually met and then dated for a while IRL (in real-life). It was a victory for geeks everywhere. Or at least a victory for geeks named John who lived on the University of Michigan’s North Campus – a  secluded “special campus” -- for engineers, math students, computer scientists, classical musicians and other assorted nerds -- located deep within a forest on the outskirts of town and only accessible via a lengthy bus ride from main campus, as if to quarantine and protect the rest of the campus from kids with overly high IQs and underdeveloped social skills.

 

That was 1995, and online dating (like the Internet itself) was still in its infancy, if it even existed at all. Today, nearly twenty years later, there’s no shortage of online dating Web sites and mobile applications. Anyone who has watched any late-night television has probably seen ads for sites like Match.com, eHarmony, or OKCupid that promise to help you find your soul mate based on your answers to a survey or personality test. And then there’s a whole slew of more specialized sites such as DatingforParents (for single parents), BeautifulPeople (for people who are attractive and know it), Adam4Adam (for gay singles), OurTime (for singles over 50), JDate (for Jewish singles) or Christian Mingle (for Christian singles), or even Farmers Only (for “good ol' country folk”). And surprisingly (or perhaps not surprisingly) there are even sites like AshleyMadison and NoStringsAttached for married people looking to do a little dating on the side. Oh my!

 

Many parallels can be drawn between the rapid advancement of online dating and the software industry itself, as well as the Customer Relationship Management (CRM) space specifically. Both had humble beginnings and clunky interfaces that only a computer programmer (or other sufficiently tech-savvy geek) could understand. And both slowly evolved, adding more features and functionality. Both eventually recognized the need for personalization as well as user-friendly user interfaces. And, not surprisingly, in today’s era of smart phones and tablets, both now offer mobile apps that incorporate location-based services!

 

The device-security firm iovation says 39% of online dating now happens through mobile apps. And, as with CRM scenarios, it’s all about location, location, location! Just as location-based CRM apps can be used, for example, to optimize the routes of service technicians based on their current location, so can online dating apps be used to help a hapless love-seeker find a nearby soulmate just looking for love in the bar or coffee shop across the street! Location-based dating apps like Tinder and Grindr are all the rage. Tinder was recently in the news when several Winter Olympic athletes confessed that they had to uninstall the app because it was interfering with their Olympic preparation and training. And at the last Summer Olympics, so many athletes logged on at once that the Grindr server crashed, disappointing a lot of lonely would-be medalists (I bet their coaches were happy, though).

 

I guess the moral of the story here is that, whether you are looking for love, looking for archived purchase orders, or looking for the nearest HVAC unit in need of repair, it’s going to be a lot easier and a lot more efficient if you are leveraging the latest software applications and location-based services than if you are sitting in the back of a dark computer lab in the middle of the night, randomly pinging the server and hoping for an answer. Click here to learn more about SAP’s powerful new suite of mobile CRM apps, called Fiori, that can make your life easier... and leave you with more time to look for love. ♥ ♥ ♥

Is an integrated business process worth the pain?


So who can relate to the poor fellow below....

[cartoon: joke_1.png]

I have encountered many such users who just want to take care of their particular task and get on with their day. As long as it's quick and painless for them, they don't really care about upstream or downstream impacts; for the most part, that is. Of course there are always exceptions...

 

When it comes to working with an integrated business suite, you sometimes have to take a little more time to conduct your day-to-day business. This is not to make your life complicated but rather to simplify the process as a whole; other people in the process benefit greatly from the additional effort that you just put in. As an example, I am driving an EDI project at a client right now where they are receiving an 855 Purchase Order acknowledgment from their suppliers. These sometimes error out to workflow, but the client doesn't see the need to rush to ensure they are posted. Well, these 855 messages contain date changes, price changes, quantity changes and possibly material substitutions. If you don't post those against the PO before receiving the goods or the invoice, they won't match! We could also miss customer expectations because of a delayed delivery date that we "weren't aware of". By not taking the time to post the PO confirmation we could affect the customer, the warehouse receiving group and accounts payable.

 

Last year it took a meeting of 10 people and about 20 man-hours to "discuss" 1 PO confirmation that went wrong because it held a price change... What is the cost of having to do that each time there is an error?

 

I was wondering if anyone else out there has a similar story where a little up-front effort could have saved a larger downstream cleanup effort...

What do you want in 2014?


We know your time is valuable, and that is why we want to hear from you on what you want in 2014!

 

Please take one minute of your day and complete a quick customer survey to gauge your opinion on targeted customer webinars for the coming year. We would like to find out where your focus will be and what is important to you in terms of data warehousing and analytics.

 

We will then use your response to analyze opinions about different customer offerings in 2014. Click here to take the survey.

 

Thanks for helping us serve you better!

Run with Purpose Challenge: My Run with Purpose story


When Grace Chiu told me about the Run with Purpose Challenge, I thought why not? I’ve never written a blog before and there is nothing more rewarding than being presented a challenge and facing it head on.

 


Who am I?

 

My name is Carman Zhou and I’m from Toronto, Ontario, so yes that means that I get to experience temperatures that are -30 C with wind chill in the winter and +40 C with humidity in the summer. I’m a part of the HRBP team in Canada working in the Toronto office. My team is pretty virtual (Hey team!) but luckily, Telepresence gives us the opportunity to connect with each other. I studied HR at the University of Toronto but started at SAP working in the Industry and Solution Engineering team. When the HR internship came up in Toronto, I couldn’t help but be interested in it. Fast forward a couple months and here I am. Working alongside some of the most talented HR people I know and connecting with the HR community beyond the HRBP team in Canada.

 

My run with purpose story

 

I really liked the idea of the challenge of looking at something I did in the past and finding out how it helped me realize my passions. People used to ask me why I chose HR as a career, and my answer to them was always “I’m not quite sure, I guess I’m in Business and it’s available as a major”. Growing up, I was always good with numbers and I loved solving complex math questions. There was just something gratifying about being able to figure out the answers. So logically, everyone assumed I would have turned to Finance or Accounting, and when my friends found out I was streaming in HR, they were confused.

 

I once worked in a sales role; I had a quota that I managed and was constantly on the phone. During that time, I lived for Fridays and dreaded Mondays. There was nothing worse than waking up and going into work. I knew I had to make a change and steer my career in a better direction. I had a background working in IT and was interested in learning more because of the constant growth and change in the IT world. SAP gave me this opportunity to learn and explore my passions.

 

Working in HR at SAP is when I realized that you can love your work. It’s actually possible to wake up and want to be at the office and see your colleagues. I realized that there was a reason why I chose HR as a career. I love working with people, helping them, and building strong relationships. I learned that sales is not something I am passionate about, and thanks to working in the sales role, I now know what I enjoy and can pursue a career in something I am passionate about.

 

There are a tremendous number of opportunities out there, each of which takes you to a different path and allows you to learn more about yourself. I am thankful for my learning experiences and the opportunity at SAP to find out what makes me run with purpose in my work life.

 

Join in on the challenge

 

To get this challenge started, I am inviting you to join in on the conversation. If you are interested in sharing your stories, please post a blog in the Career Center space here on SCN with “Run with Purpose Challenge” in your blog title and respond back to one of the questions below:

  • My purpose in my work is definitely not ___. I tried ___ out and after this experience I realized it really wasn't for me, but ___ is. (Give an example of something you tried that may not have worked out, or has helped redirect you to work that has been more enjoyable and rewarding for you.)
  • What inspires you the most about your work right now?
  • How do your passions contribute to your work?
  • How has having purpose/passion in your work allowed for career success or advancement?
  • How do you think your work creates a positive impact for other people or the world?
  • What does it mean to run with purpose and how does it relate to the work you do?

 

Share a fun fact and photo of yourself along with your blog post, or a photo of your office or country if you’re camera-shy!

 

Don't forget to invite a friend or colleague to join in on the conversation!

 

Enjoy and have fun!

 

 

 

Want to find a job you can really be passionate about?

Browse all of our open positions at - http://jobs.sap.com/

Learn more about the workplace culture at SAP, see pics of our offices, talk to recruiters, and get real time job openings by connecting with us on our social pages:

http://facebook.com/LifeatSAP

http://twitter.com/LifeatSAP

Function Modules for sealing (sellado) CFD / CFDi documents in SAP ERP


Hello.

- In SD, the Function Module that performs the digital seal is IDMX_DI_SD_SINGLE_SIGN, and it contains the FM IDMX_DI_SD_DATA_EXTRACT, which builds the original string (cadena original).

- In FI, the Function Module that performs the digital seal is IDMX_DI_FI_SINGLE_SIGN, and it contains the FM IDMX_DI_FI_DATA_EXTRACT, which builds the original string.

Both CFD and CFDI coexist in each of them.

I hope you find this useful.

Regards!
Hugo.

The Winter Olympics...Part 3: SAP Business Process Management – Go for Gold with Teamwork


Many of the astounding feats from the Winter Olympics are borne of intense personal drive, determination and fortitude. They are individual sports from a participant perspective, yet there is considerable teamwork that has gone into the preparation and training.

 

Ice hockey has superstars, yet it is undeniably a team sport.  It is fast paced, filled with athleticism, skill and coordination. When the puck moves quickly and accurately from stick to stick to back of net, well that elevates the teamwork to something more akin to poetry.

 

Teamwork is at the heart of our everyday existence, at home or work. Our efforts to continuously improve our business processes are not achieved by one person trying to move the mountain. Perhaps it is the right-winger who starts the play out of the defensive zone and coordinates an offensive rush.  Seldom does that individual successfully make a scoring play 200 feet down the ice without some assistance.  Likewise, the defensive play relies on communication and understanding each other’s role before it ultimately falls to the goalie to make a stunning save.  Our project and work teams are the same.

 

The project manager cannot be busy changing configuration or setting up new business scenarios without communicating to the rest of the team and allowing others to carry out their own unique roles on the line.  To study the game of hockey is really to study the characteristics of a well-tuned team activity.  Disciplined hard work gets results not dissimilar to what we must do on a weekly/daily/hourly basis to deliver successful results for our organizations.  Oh, and there is sometimes bloodshed.  Just saying.  Enjoy the games and watch out for flying pucks.


From The Archives: SELECT TOP and START AT


In this post, originally written by Glenn Paulley and posted to sybase.com in April 2009, Glenn uses the TOP/START AT query syntax to demonstrate how an ORM can hurt database application performance and make applications difficult to debug and understand.


Lately my staff and I have spent some effort looking at SQL Anywhere's support for SELECT TOP N and SELECT TOP N START AT M. SQL Anywhere has supported Microsoft's SELECT TOP N syntax for several releases, and in Version 9 introduced support for the START AT variant to permit the specification of the starting offset for the result set.

 

These Microsoft extensions are not supported by the current ISO SQL standards, nor are they directly compatible with the result-limiting syntax supported by MySQL, which utilizes LIMIT and OFFSET syntax as part of the suffix of a query expression. In the interests of compatibility, we intend to support MySQL's LIMIT and OFFSET in SQL Anywhere in our next release (Update: LIMIT/OFFSET support was added in SQL Anywhere version 12). In the past two weeks I've been looking at re-implementing the NHibernate dialect for SQL Anywhere, currently called the "SybaseAnywhereDialect.cs" dialect in the NHibernate 2.0.1 distribution. As I expected, many of the issues with this dialect are identical to the issues with the Java Hibernate version, which I rewrote and packaged into a JAR file called SQLAnywhere10Dialect.java.
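
To make the two result-limiting syntaxes mentioned above concrete, here is a small illustration (the table and column names are invented for the example). Both statements return rows 21 through 30 of the ordered result, since START AT is 1-based while OFFSET counts the number of skipped rows:

-- SQL Anywhere, using the Microsoft-style extension
SELECT TOP 10 START AT 21 emp_id, surname
  FROM employees
 ORDER BY surname;

-- The MySQL-style equivalent
SELECT emp_id, surname
  FROM employees
 ORDER BY surname
 LIMIT 10 OFFSET 20;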


The reason for this post, however, is to point out the difficulty that application developers can have in diagnosing NHibernate issues: not only can the timing of SQL requests differ from the application's apparent behaviour, but the SQL that NHibernate generates is also substantively different from what one would anticipate, given the application. In particular, I'd like to point out how the Microsoft SQL Server 2005 dialect, "MSSQL2005Dialect.cs", handles the equivalent of the OFFSET clause. Because Microsoft SQL Server 2005 does not support OFFSET directly, the "MSSQL2005Dialect.cs" NHibernate dialect rewrites the query substantially using the ROW_NUMBER() window function. In a nutshell, the NHibernate dialect parses the SQL statement generated by NHibernate, modifies it as necessary to serve as a derived table, and subsequently wraps it in an outer block that utilizes the ROW_NUMBER() function to provide row-numbering semantics, whose result is restricted by an additional WHERE clause. Here's a code snippet that illustrates what's going on:

 

/// <summary>
/// Add a LIMIT clause to the given SQL SELECT
/// </summary>
/// <param name="querySqlString">The <see cref="SqlString"/> to base the limit query off of.</param>
/// <param name="offset">Offset of the first row to be returned by the query (zero-based)</param>
/// <param name="last">Maximum number of rows to be returned by the query</param>
/// <returns>A new <see cref="SqlString"/> with the LIMIT clause applied.</returns>
/// <remarks>
/// The LIMIT SQL will look like
/// <code>
/// SELECT TOP last (columns) FROM (
///   SELECT ROW_NUMBER() OVER(ORDER BY __hibernate_sort_expr_1__ {sort direction 1} [, __hibernate_sort_expr_2__ {sort direction 2}, ...]) as row, (query.columns) FROM (
///     {original select query part}, {sort field 1} as __hibernate_sort_expr_1__ [, {sort field 2} as __hibernate_sort_expr_2__, ...]
///     {remainder of original query minus the order by clause}
///   ) query
/// ) page WHERE page.row > offset
/// </code>
/// Note that we need to explicitly specify the columns, because we need to be able to use them
/// in a paged subselect. NH-1155
/// </remarks>
public override SqlString GetLimitString(SqlString querySqlString, int offset, int last)
{
    int fromIndex = GetFromIndex(querySqlString);
    SqlString select = querySqlString.Substring(0, fromIndex);

    List<string> columnsOrAliases;
    Dictionary<string, string> aliasToColumn;
    ExtractColumnOrAliasNames(select, out columnsOrAliases, out aliasToColumn);

    int orderIndex = querySqlString.LastIndexOfCaseInsensitive(" order by ");
    SqlString from;
    string[] sortExpressions;
    if (orderIndex > 0)
    {
        from = querySqlString.Substring(fromIndex, orderIndex - fromIndex).Trim();
        string orderBy = querySqlString.Substring(orderIndex).ToString().Trim();
        sortExpressions = orderBy.Substring(9).Split(',');
    }
    else
    {
        from = querySqlString.Substring(fromIndex).Trim();
        // Use dummy sort to avoid errors
        sortExpressions = new string[] { "CURRENT_TIMESTAMP" };
    }

    SqlStringBuilder result =
        new SqlStringBuilder()
            .Add("SELECT TOP ").Add(last.ToString()).Add(" ")
            .Add(StringHelper.Join(", ", columnsOrAliases))
            .Add(" FROM (SELECT ROW_NUMBER() OVER(ORDER BY ");
    AppendSortExpressions(columnsOrAliases, sortExpressions, result);
    result.Add(") as row, ");

    for (int i = 0; i < columnsOrAliases.Count; i++)
    {
        result.Add("query.").Add(columnsOrAliases[i]);
        bool notLastColumn = i != columnsOrAliases.Count - 1;
        if (notLastColumn)
        {
            result.Add(", ");
        }
    }

    for (int i = 0; i < sortExpressions.Length; i++)
    {
        string sortExpression = RemoveSortOrderDirection(sortExpressions[i]);
        if (!columnsOrAliases.Contains(sortExpression))
        {
            result.Add(", query.__hibernate_sort_expr_").Add(i.ToString()).Add("__");
        }
    }

    result.Add(" FROM (").Add(select);

    for (int i = 0; i < sortExpressions.Length; i++)
    {
        string sortExpression = RemoveSortOrderDirection(sortExpressions[i]);
        if (columnsOrAliases.Contains(sortExpression))
        {
            continue;
        }
        if (aliasToColumn.ContainsKey(sortExpression))
        {
            sortExpression = aliasToColumn[sortExpression];
        }
        result.Add(", ").Add(sortExpression).Add(" as __hibernate_sort_expr_").Add(i.ToString()).Add("__");
    }

    result.Add(" ").Add(from).Add(") query ) page WHERE page.row > ").Add(offset.ToString()).Add(" ORDER BY ");
    AppendSortExpressions(columnsOrAliases, sortExpressions, result);

    return result.ToSqlString();
}

 

In my view, Hibernate's (or NHibernate's) architecture is far from suitable for doing this type of complex manipulation. What would be arguably better is for NHibernate to expose the underlying representation of the intended query, so that a dialect could modify THAT abstraction, taking into account the context provided by the NHibernate mapping, prior to generating an SQL statement. This is the approach taken by Microsoft with its LINQ framework; DBMS vendors can implement their own optimizations on LINQ's canonical query trees prior to generating an SQL statement to be executed by the underlying database server. Semmle's .QL [1-3] takes this pre-optimization idea one step further, and provides built-in optimization of requests prior to the SQL generation step, a substantive advantage that Oege de Moor described at the 2008 ACM SIGMOD conference in Vancouver last summer.

 


[1] Oege de Moor, Damien Sereni, Mathieu Verbaere, Elnar Hajiyev, Pavel Avgustinov, Torbjorn Ekman, Neil Ongkingco, and Julian Tibble (2007). .QL: Object-Oriented Queries Made Easy. In Generative and Transformational Techniques in Software Engineering, LNCS, Springer-Verlag, 2008.
[2] Oege de Moor, Damien Sereni, Pavel Avgustinov and Mathieu Verbaere (June 2008). Type Inference for Datalog and its Application to Query Optimisation. Proceedings, ACM Principles of Database Systems, Vancouver, BC, pp. 291-300.
[3] Damien Sereni, Pavel Avgustinov and Oege de Moor (June 2008). Adding Magic to an Optimising Datalog Compiler. In Proceedings of the 2008 ACM SIGMOD Conference, Vancouver, BC, pp. 553-565.
[4] Pierre Henri Kuate, Tobin Harris, Christian Bauer and Gavin King (2009).  NHibernate in Action. Manning Publications, Greenwich, Connecticut. ISBN 978-1-932394-92-4.

SAP Hybrid Cloud: Proves Successful for Latin America E-Invoicing Compliance – Part 3 of 4


For the next installment in my series, SAP Hybrid Cloud Proves Successful for Latin America E-Invoicing Compliance, I wanted to cover why you can’t rely solely on a pure managed service provider or a 100% cloud provider for e-invoicing in Latin America. It is really important to understand that there are 3 functional requirements for Latin America compliance, and 100% cloud or EDI Value Added Networks cover only the last component: government connectivity.

You need to be aware of the ERP upgrade requirements and the process orchestration (i.e. signing, PDF creation, turnaround attributes, extended data elements) even before you worry about the connectivity to the government. After all, the ERP and process orchestration requirements create 80% of the cost components when implementing, monitoring, or maintaining compliance in Latin America.

 

Traditional managed services, EDI VANs, e-invoicing networks, and signing providers fail to provide an end-to-end service. Some companies turned to 100% pure cloud providers, but they found these EDI-type VANs were not the best answer in the long run, for two reasons:

 

  • Customized ERP creates support & maintenance issues
    • SAP – no two SAP systems are the same and getting an SAP system to work with the cloud provider’s standards is 80% of the implementation and change management headache.  While the government requirement is standardized – connecting a company to the government system is not.  Non-standard integration scenarios stem from companies having their own internal processes and more importantly end customer requests. Cloud and EDI providers typically run from non-standardization and force the ERP extraction portion of the implementation and maintenance to the end user.  So in the end, what value was really provided?  And yes, you just created a monitoring issue, a support situation where people will point fingers at each other for a failed invoice, and a change management and testing nightmare as you now have two parties involved.

 

  • Shipping is Affected
    • In Brazil, if you don’t have your signed DANFe on the truck, you can’t ship. If you rely on a 100% cloud provider and the network is down, or your internet is down, you can’t ship. Brazil offers a model called “Contingency” whereby you can print a special piece of paper, and as long as you have power to your printer, you can still ship. Solutions will then automatically reconcile those notas fiscais when the network comes back online. A 100% cloud solution cannot provide on-premise contingency.

 

 

So the Hybrid Architecture is the strategy winning the day, and in our next article we will explore why in more detail.

SAP Serialized Track and Trace in the Pharma industry - Seminar on 12th March


Q Data USA is hosting a seminar on 12th March to cover the important topic of how SAP solutions play a role in providing answers to the serialization issue that faces the pharma industry today. We have 8 sessions by 6 thought leaders giving their thoughts on DQSA, OER / AII, GBT, Implementation and Integration on the production line...

Download Seminar Flyer | Abstracts

Eventbrite - 1st Annual Serialized Track and Trace Seminar

Date: 12th March – 8 sessions starting 8am PST and ending at 3pm PST

 

Congress enacted the Drug Quality and Security Act of 2013 (DQSA), formerly referred to as HR 3204, in November 2013 and it starts to take effect in a significant way in January 2015. You should be planning to attend the 1st annual Serialized Track-and-Trace Seminar on 12th March, 2014. It’s a virtual seminar that allows you to attend the sessions from the comfort of your own home or office, yet you still have access to all the content and experts that you would expect from a live event. It’s a truly unique and valuable opportunity for those focused on these latest changes in legislation for the Pharma industry.


With this latest passing of legislation, it’s more critical than ever for you to gain a true understanding of the implications for your business, the time-frames for compliance, and the intersection of the law with SAP-based solutions.

Listen to industry experts in an information-packed seminar. Presenters will explore DQSA and what it means to the Pharmaceutical Supply Chain players. Hear what solutions SAP and partners are providing to distributors and manufacturers to successfully implement serialized track-and-trace technology and further secure the safety and security of the healthcare supply chain.

This seminar also offers a unique opportunity to hear from leading solution providers and solution implementers.

This year’s seminar topics are split into 2 tracks:

TRACK 1: An executive focus

Understanding the business impact and overall solutions available to you

  • What does the Drug Quality and Security Act of 2013 (DQSA), formerly referred to as HR 3204, mean to you?
  • What strategies and solutions should you be considering for Supply Chain efficiencies and regulatory compliance?
  • Solutions to help with Supply Chain efficiencies for serialized items
  • Implementation considerations for global serialization

TRACK 2: A Solution focus

This track focuses on serialized Track and Trace solutions provided by SAP and Inxites

  • An overview of SAP’s Track and Trace Solution with a focus on SAP EM
  • Looking in depth at SAP OER and SAP AII
  • Extending SAP’s Serialized Track and Trace solution with Verifier Suite
  • A detailed look at SAP Global Batch Traceability

 

Target Audience

This seminar is a must-attend for pharmaceutical distributors, manufacturers and retailers seeking to explore implementation strategies and to understand the DQSA legislation and its impact on the healthcare supply chain. If serialization, batch traceability and regulatory compliance are of importance to you, then don’t miss out on this great opportunity.

 

Presented by Q Data USA

Supported by TBMG, Inxites, SAP, Pharma Logic Solutions and Auto-ID Consulting LLC


Agenda


Track 1 – Executive Focus

Session 1 (8:00am–9:00am PST) – What does the Drug Quality and Security Act of 2013 (DQSA), formerly referred to as HR 3204, mean to you?
On November 27, 2013, H.R. 3204 was signed into US Federal law as the Drug Quality and Security Act of 2013 (DQSA). Among other things, the new law will impact the way prescription drug trades are documented in the United States and preempt state laws for drug traceability, including California’s e-pedigree law. This one-hour session outlines the requirements of the DQSA, Title II, referred to as the Drug Supply Chain Security Act (DSCSA).
Speaker: Bill Fletcher

Session 2 (9:30am–10:30am PST) – What strategies and solutions should you be considering for Supply Chain efficiencies and regulatory compliance?
Many countries around the world either have existing or pending laws and regulations to track the movement of prescription drugs in an effort to improve the integrity of their drug supply chain. Life sciences companies are preparing new systems to comply with serialization and traceability. This one-hour session will outline the requirements and suggest methods to comply and to gain efficiencies and benefits from traceability.
Speaker: Bill Fletcher

Session 3 (12:00pm–1:00pm PST) – Solutions to help with Supply Chain efficiencies for serialized items
SAP has been actively leading the drive towards providing comprehensive serialization solutions dating back to 1999, when they were the first software company to join and collaborate with industry visionaries at MIT’s Auto-ID Center. In recent years, as counterfeit detection and supply chain integrity have climbed to the top of the agenda, SAP is once again leading the way in integrating serialization for track & trace with traditional “back end” ERP processes, evidenced by the fact that the majority of the world’s major pharmaceutical companies are deploying or planning to deploy SAP Auto-ID Infrastructure (AII) and SAP Object Event Repository (OER). This session will outline the business drivers for serialization and cover the SAP solution functionality and the standards that they support. It will also touch on batch traceability in addition to serial number traceability.
Speakers: Stephen Cloughley and Kevin Wilson

Session 4 (1:30pm–2:30pm PST) – Implementation considerations for global serialization
Implementation of a serialization solution requires a holistic approach and a broad view of how the data will be provisioned, captured, managed and disseminated. A phase-based, multidisciplinary and cross-functional approach is required to deliver successful results. Building in adaptability and extensibility is key to meeting evolving requirements. This one-hour session will explore the many challenges facing serialization projects and key issues and approaches to consider.
Speaker: James Tucker

Wrap-Up (2:30pm–3:00pm PST) – Open forum: What are your next steps?
In this quick wrap-up session we will pull together any open questions and also give an outline of a typical road-map that TBMG follows towards helping interested companies make an informed decision around if and how to implement a serialization solution.
Speaker: Martin Rowan

Track 2 – Solution Focus

Session 1 (8:00am–9:00am PST) – An overview of SAP’s Track and Trace Solution with a focus on SAP EM
SAP’s Track and Trace solution comprises the following components:
  • SAP Object Event Repository (OER)
  • SAP Auto-ID Infrastructure (AII)
  • SAP Event Management (EM)
  • SAP Global Batch Traceability (GBT)
In this session we will cover an overview of each of the components and do a deep dive into SAP Event Management functionality, showing how it can be applied to your business process. We will also dive into a demo system to show some hands-on views of SAP Event Management screens.
Speaker: Kevin Wilson

Session 2 (9:30am–10:30am PST) – Looking in depth at SAP OER and SAP AII
These are the 2 main SAP modules providing Track and Trace functionality for serialized items. In this session we will run through the standards that are applicable when talking serialization, and in particular Pharma speak. We will also cover the needed architecture together with the scope and functionality covered by each module. After attending this session you’ll have a clear understanding as to what the intended use and scope is for SAP OER and AII.
Speaker: Kevin Wilson

Session 3 (12:00pm–1:00pm PST) – Extending SAP’s Serialized Track and Trace solution with Verifier Suite
A serialization project has many nuances, making it an extremely tricky and complex implementation to handle. In this session we will highlight these complexities and share details on common roadblocks that are encountered. We will also introduce the Verifier Suite of components, developed by Inxites and certified by SAP, which helps bridge the gap for Pharma companies between what is required to gain supply chain efficiencies in your serialized process and what is provided as standard within SAP. A typical serialization project approach will be outlined, showing which components need to be addressed.
Speaker: Guido Rijcken

Session 4 (1:30pm–2:30pm PST) – A detailed look at SAP Global Batch Traceability
The need is out there to be able to understand the reach of a bad batch that is discovered in your supply chain. How can we quickly search the batch genealogy top-down and bottom-up? Can we take those results, see the distribution record and report it to the authorities? Can we use that information to effect a targeted recall and limit our exposure? In this session we’ll take an in-depth look at SAP’s solution that covers these questions. You’ll get a detailed look into what SAP GBT is and what is needed to implement the solution.
Speakers: Stephen Cloughley and Kevin Wilson

Wrap-Up (2:30pm–3:00pm PST) – Open forum: What are your next steps?
In this quick wrap-up session we will pull together any open questions and also give an outline of a typical road-map that Q Data follows towards helping interested companies make an informed decision around if and how to implement a serialization solution.
Speaker: Kevin Wilson


Speaker Bios


1 – Bill Fletcher, Managing Partner, Pharma Logic Solutions, LLC
Email: bfletcher@pharma-logic.com
Mr. William (Bill) Fletcher’s background spans over 30 years in pharmaceutical, enterprise software and healthcare systems. In December 2013, he completed his 21st commercial serialization and traceability project for global life sciences companies, including design, strategy and requirements. He has recently been helping companies implement projects to comply with serialization and tracking regulations around the world, including the recent US Federal Drug Quality and Security Act of 2013 (DQSA), China, EU, Turkey, South Korea, Brazil and others. Mr. Fletcher has received various industry certifications, including SAP Auto-ID Infrastructure (AII) version 7.1 and GS1 Certified Professional (certifying a detailed understanding of barcoding and serialization standards).
Download Bill’s full bio here.

2 – Stephen Cloughley, Senior Director, SAP Labs LLC
Email: stephen.cloughley@sap.com
Mr. Cloughley is a Senior Director at SAP Labs LLC, responsible for Supply Network Traceability. Stephen has responsibility for SAP’s supply network traceability solution as part of its sustainability program, which has SAP Event Management as a key technical component. He came to SAP through the acquisition in 2005 of Lighthammer, where he was part of the leadership team responsible for business development. Prior to Lighthammer, Stephen was President and CEO of Base Ten Systems, Inc., a leader in Manufacturing Execution Systems (MES) for the FDA-regulated industries. Stephen is a chemical engineer from University College Dublin and has over 20 years’ experience in the software industry in Europe, South Africa and the United States.

3 – Kevin Wilson, ES+ Practice Lead and SAP Solution Engineer, Q Data USA, Inc.
Email: kwilson@qdatausa.com
Mr. Wilson has more than 24 years’ experience in business analysis, solution architecture, project management, and solution development & implementation, including 18 years of SAP experience covering supply chain, retail, utilities & manufacturing, amongst others. He brings business case and value engineering expertise covering multiple, integrated SAP solutions with a strategic focus on Supply Chain Management, and is an author and thought leader in SAP supply chain execution processes.
Download Kevin’s full bio here.

4 – James Tucker, SAP Serialization and Traceability Expert, Auto-ID Consulting LLC
Email: jt@aieguru.com
Mr. Tucker has been referred to as the AIE Guru for his expertise with SAP serialization and traceability solutions. He has seen serialization evolve from its roots in early RFID pilots, and today leads global full life cycle implementations leveraging the latest in SAP solutions and best practices. Over the past several years, Mr. Tucker has led major pharmaceutical clients from initial strategic planning through blueprinting and onwards to production implementations. Through this experience, James provides a holistic view of the challenges faced when implementing a serialization program. His broad background in IT strategy and operations, client delivery, and developing solutions leveraging emerging technologies enables him to provide a cross-functional understanding of the obstacles faced when trying to adapt to an ever-changing landscape of regulations and requirements.

5 – Martin Rowan, Managing Partner, The Business Maturity Group
Email: mrowan@tbmginc.com
Mr. Rowan’s primary focus is to facilitate solutions that help companies optimize their Integrated Supply Chains to achieve a level of Business Maturity and increase profitability. He has been actively involved with local and international ERP markets for over 20 years. His experience covers a wide variety of industries across multiple countries. Some clients he has represented across the globe are BMW, Toyota, General Motors, Mittal Steel, Sanyo, Insight and Asics. Martin holds a Bachelor’s Degree in Engineering and a Master’s degree in Commerce.

6 – Guido Rijcken, Managing Director, Inxites Americas, Inc.
Email: guido.rijcken@InXites.be
Mr. Rijcken is one of the “early days serialization” consultants and has been involved in many serialization projects across many different industries. He has 30 years of ICT/ERP experience focusing on functional consultancy, project management, solution architecture, and shop floor integration and execution for mass serialization projects.

Download Seminar Flyer | Abstracts

Eventbrite - 1st Annual Serialized Track and Trace Seminar

How to add a D3 extension for SAP Lumira


Last time out we created the Hello World extension for SAP Lumira. If you have not created an SAP Lumira extension before I'd suggest reading that post first.

 

This time I will walk you through how to bring in a D3 chart as an extension for SAP Lumira. I will be using the D3 Bullet Chart as an example, but the steps should be similar for other charts.

 

You will need to be comfortable with JavaScript and D3 to follow this guide as this is a more advanced example.

 

 

Step by step guide

 

Note: The vizPacker in Lumira 1.15 works with SVG as the chart container only. If your chart extension uses a DIV container it will not work.

 

Prerequisites

You need to install:

  1. SAP Lumira 1.15 (which includes an updated vizPacker that fixes the "Failed to create chart: TypeError: Cannot read property '0' of undefined" error).
  2. Google Chrome as vizPacker only works with Google Chrome
  3. I'd suggest you watch the video first as there are a lot of steps involved in this...

 

Prepare the D3 code for re-use

To save time, I have already made these changes to a local copy of the D3 Bullet Chart code (look for the // MDL comments); you can grab my updated version here.

 

The Bullet Chart and vizPacker use a simple JSON data structure; think of it as a JSON version of a CSV file.

 

 

[

  {"title":"Revenue","subtitle":"US$, in thousands","ranges":[150,225,300],"measures":[220,270],"markers":[250]},

  {"title":"Profit","subtitle":"%","ranges":[20,25,30],"measures":[21,23],"markers":[26]},

  {"title":"Order Size","subtitle":"US$, average","ranges":[350,500,600],"measures":[100,320],"markers":[550]},

  {"title":"New Customers","subtitle":"count","ranges":[1400,2000,2500],"measures":[1000,1650],"markers":[2100]},

  {"title":"Satisfaction","subtitle":"out of 5","ranges":[3.5,4.25,5],"measures":[3.2,4.7],"markers":[4.4]}

]

 

If you need a different data structure in your extension (like a parent/child hierarchy) you will need to map from the simple JSON format to what you need.
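
As a minimal sketch of such a mapping (the Region, Country and Value column names here are invented for illustration; they are not part of the Bullet Chart data):

// Illustrative only: reshape flat vizPacker rows into a parent/child tree.
function toHierarchy(fdata) {
  var root = { name: "root", children: [] };
  var byParent = {};                            // parents already created, keyed by name
  fdata.forEach(function (row) {
    var parent = byParent[row.Region];
    if (!parent) {
      parent = { name: row.Region, children: [] };
      byParent[row.Region] = parent;
      root.children.push(parent);
    }
    parent.children.push({ name: row.Country, value: row.Value });
  });
  return root;
}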

 

At a high level here are the changes I made:

 

    1. Save a local copy of the bullet chart index.html, bullet.js and bullets.json files from: Bullet Charts
    2. Update index.html:
      • We are going to mimic the vizPacker behavior
    • Add an SVG tag as the container for the extension into the page body and set the id to vis

      <svg id="vis"></svg>

       

    • Make sure the D3 library is referenced in the page:

      <script src="http://d3js.org/d3.v3.min.js"></script>

    • Add the vis variable and assign it from the vis SVG tag at the top of the script - this is the variable the vizPacker render code uses:

      <script>

       

      var vis = d3.select("#vis");

    • Hard code the test JSON data in a variable called fdata - this is what the vizPacker render function does by default
      • See the following optional step
    • Optional step - change the fdata JSON property names to make them more user friendly when they are shown in Lumira:
      • Combine title and subtitle into one property called Titles
      • Change measures to Actuals
      • Change markers to Target
      • Change ranges to Ranges
      • Update the bullet chart code to use the user friendly property names:

        var vis = d3.select("#vis");

         

        // MDL: embed JSON so we can test with Google Chrome locally.

        // MDL: renamed "data" to "fdata".

        // MDL: Combined "title" and "subtitle" as "Titles" array.

        // MDL: Renamed ranges to "Ranges".

        // MDL: Renamed measures to "Actuals".

        // MDL: Renamed markers to "Target".

        var fdata = [

              {"Titles":["Revenue","US$, in thousands"],"Ranges":[150,225,300],"Actuals":[220,270],"Target":[250]},

              {"Titles":["Profit","%"],"Ranges":[20,25,30],"Actuals":[21,23],"Target":[26]},

              {"Titles":["Order Size","US$, average"],"Ranges":[350,500,600],"Actuals":[100,320],"Target":[550]},

              {"Titles":["New Customers","count"],"Ranges":[1400,2000,2500],"Actuals":[1000,1650],"Target":[2100]},

              {"Titles":["Satisfaction","out of 5"],"Ranges":[3.5,4.25,5],"Actuals":[3.2,4.7],"Target":[4.4]}

            ];

        // MDL: end

    • Add a y position attribute that positions each bullet chart in the vis SVG tag
      • Note: The original bullet chart relied on a DIV container, so it did not need the y position; it would automatically put the next bullet chart onto a new line in the web page. That does not happen when you are using SVG, so you need to add the y position to get the same look (see the sketch after this list)
        y position.png
  • Remove the randomize button and code as that is not needed in Lumira

  • Update bullet.js:
    • Add code so that the bullet chart can still draw when there are no Titles, Actuals, Ranges, or Target passed to it, as that is what Lumira will do when you first create the chart
      • Calculate the max range even if there is no data (see the sketch after this list)
        draw when no data.png
  • Test the updated index.html file in Google Chrome:
    • To make sure it runs and visually looks the same as the original did
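Since the two screenshots above only hint at the code, here is a rough sketch of both changes, assuming the structure of the original bullet chart files (the selection names and sizes are approximations, not the exact code):

// index.html: position each bullet chart vertically inside the single #vis SVG.
// With the original DIV container each chart got its own line automatically;
// inside one SVG we instead translate each group down by its index.
var chartHeight = 50, chartSpacing = 10;

var bullets = vis.selectAll("g.bullet")
    .data(fdata)
  .enter().append("g")
    .attr("class", "bullet")
    .attr("transform", function (d, i) {
      return "translate(0," + i * (chartHeight + chartSpacing) + ")";
    });

// bullet.js: guard against missing properties so the chart can still draw
// when Lumira first creates it with no data bound yet.
function safeArrays(d) {
  var rangez = (d.Ranges || []).slice().sort(d3.descending);
  return {
    ranges: rangez,
    measures: (d.Actuals || []).slice(),
    markers: (d.Target || []).slice(),
    maxRange: rangez.length ? rangez[0] : 1 // dummy max when there is no data
  };
}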

    Create the D3 extension

    Follow these steps to create and test your D3 Bullet Chart extension inside the vizPacker:

    1. Using Google Chrome, open: <installdir>\SAP Lumira\Desktop\utilities\vizPacker\vizPacker.html
    2. The LAYOUT DESIGN tab is where you define the ID, Name and Layout of your extension as well as edit the code
    3. Using the LAYOUT DESIGN tab, set the ID and name of your extension:
      • Click the outer gray box to display the chart visualization properties:
        Edit extension ID - outer gray box.png
      • About IDs:
        • The ID naming convention is similar to making unique Java class names
        • You use the reverse of your company web address and then add the name of the extension
          For example: com.sap.sample.d3.bulletchart
        • IMPORTANT note for SAP Lumira 1.15:
          • The ID must be all lowercase otherwise it will not work inside SAP Lumira
          • To make it easier to package, install and remove extensions I recommend removing the dots in the ID.
            For example: comsapsampled3bulletchart
      • Settings:
        • ID:       comsapsampled3bulletchart
        • Name: Bullet Chart
      • Note: As you close the properties window or move between fields the vizPacker updates the ID and Name used in the code editor
    4. Using the LAYOUT DESIGN tab, set the ID and name of your extension plot area:
      • Click the inner gray box to display the chart plot area properties:
        Edit plot area ID - inner gray box.png
      • About IDs:
        • Follow the same rules as for the extension ID, but append module to the end
      • Settings:
        • ID:       comsapsampled3bulletchartmodule
        • Name: Bullet Chart Module
      • Note: As you close the properties window or move between fields the vizPacker updates the ID and Name used in the code editor
    5. Remove the legend as the Bullet Chart does not need a legend
      • Click the X at the top of Legend:
        remove legend.png
    6. Import test data for the Bullet Chart:
      • Create a CSV version of the JSON Bullet Chart data; here is the CSV file I made for the Bullet Chart (a reconstructed sample is shown at the end of this step)
      • Switch to the DATA MODEL tab
      • Click to select and upload the CSV file:
      • Click to upload a CSV file.png
      • After you have chosen your CSV file, click OK when you see a JavaScript alert as the file is imported
      • The data model now shows the imported CSV data:
        Imported CSV data.png
      • Note: There are 8 dimensions (one for each column) and 0 measures
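      • Based on the JSON test data and the eight columns noted above, the CSV file presumably looks something like this (a reconstruction for reference, not necessarily the author's exact file):

        Title,Subtitle,Actual,Pace,Target,Range 1,Range 2,Range 3
        Revenue,"US$, in thousands",220,270,250,150,225,300
        Profit,%,21,23,26,20,25,30
        Order Size,"US$, average",100,320,550,350,500,600
        New Customers,count,1000,1650,2100,1400,2000,2500
        Satisfaction,out of 5,3.2,4.7,4.4,3.5,4.25,5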
    7. Next we will create the JSON properties (as they will be used in Lumira) and map the test data columns to them:
      • Columns Title and Subtitle ===> Titles
      • Columns Actual and Pace ===> Actuals
      • Column Target ===> Target
      • Columns Range 1, Range 2 and Range 3 ===> Ranges
    8. Measure and dimension groups (another name for arrays in Lumira terminology):
      • Measure groups (arrays) are for the numeric values in Lumira
        • So for the Bullet Chart these are the Actuals, Target and Ranges
      • Dimension groups (arrays) are for everything else:
        • So for the Bullet Chart these are the Titles
    9. Create the measure and dimension groups to match the JSON format that we need:
      • Create Titles as the first dimension group:
        • Click the drop down arrow next to Title
        • Then click the right arrow for the dimension (when expanded it shows a drop down arrow, as below)
          titles dimension group.png
        • Double click the name Entity (the name of the first radio button) and change the name to Titles, then press ENTER
          titles dimension as first group.png
        • And select the first radio button, so the Title column is now mapped into the dimension group called Titles

      • Map the Subtitle column to the Titles dimension group as well

      • Create Actuals as the first measure group:
        • Click the drop down arrow next to Actual
        • Then click the right arrow for the measure (when expanded it shows a drop down arrow, as below)
          actuals measure group.png
        • Double click the name Sales Data (the name of the first measure radio button) and change the name to Actuals, then press ENTER
          actuals measures first group.png
        • And select the first radio button, so the Actual column is now mapped into the measure group called Actuals
      • Map Pace into the Actuals measure group
      • Map Target into measure group 2 and call it Target
      • Map Range 1, Range 2 and Range 3 into measure group 3 and call it Ranges
      • Click to Apply the Data Model:
        Apply data model.png
        The data model, structure and test data are now applied in the vizPacker

    10. Prepare the render function:
      • The only lines we really need from the template render function are for the fdata property, which generates the JSON structure that we need
      • So update the render function so it looks like this:
        render with fdata only.png
    11. Add the code from bullet.js:
      • The code is wrapped in an anonymous function that we do not need, so copy all the code except the first and last lines of the file (note they have a red strikethrough in these images):
        copy bullets.js code part 1.png
        Copy down to:
        copy bullets.js code part 2.png
      • And paste the code you just copied into the bottom of the vizPacker, just before the close of the last function:
        Paste bullets,js code here.png
      • Change the bullet chart from a global namespace (d3.bullet; polluting the global namespace is bad practice for extensions) to a local variable (var d3_bullet):
        • Look for the line that says:
          d3.bullet = function() {
        • And change it to a local variable like this:
          var d3_bullet = function() {
        • We can now refer to d3_bullet when we need to create the bullet chart later on.

    12. Add in the code from index.html to create the bullet chart:
      • Copy all of the script code after the fdata line because we already have the fdata generated in our render function:
        copy index code part 1.png
        Copy down to:
        copy index code part 2.png
      • And paste the code you just copied into the bottom of the render function:
        Paste index code here.png
      • Change from the d3.bullet call to d3_bullet (the local variable):
        • Look for the line that says:
          var chart = d3.bullet
        • And change the dot to an underscore like this:
          var chart = d3_bullet
      • That is most of what we need to do, because we now have the bullet chart code and the rendering code in place.
        What we are missing are the CSS styles; we will add those soon. The pasted code wires things up roughly as sketched below.
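      • For reference, the pasted code builds and renders the chart roughly like this (a paraphrase of the original index.html with our renamed variable; the height value is illustrative):

        // Build the chart generator using the local variable from the previous step.
        var chart = d3_bullet()
            .width(bulletWidth)
            .height(50);          // illustrative height

        // Apply the generator to the positioned groups created for fdata
        // (see the y-position sketch earlier); each group draws one bullet.
        vis.selectAll("g.bullet")
            .call(chart);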

    13. Change the Bullet Chart so it adjusts to the width variable passed in to the render function:
      • Look for the line:
        var bulletWidth = 960;
      • And change it to:
        var bulletWidth = width;

    14. Add in the CSS styles from index.html to style the bullet chart:
      • The best way to do this is to add the CSS styles inline as you need them inside the Bullet Chart code
      • For now, I will save time and inject the class-level CSS styles into the page
        • This has the downside that when you save the chart in Lumira, the styles will not be used when generating the thumbnail
      • Copy the CSS bullet styles:

        .bullet { font: 10px sans-serif; }
        .bullet .marker { stroke: #000; stroke-width: 2px; }
        .bullet .tick line { stroke: #666; stroke-width: .5px; }
        .bullet .range.s0 { fill: #eee; }
        .bullet .range.s1 { fill: #ddd; }
        .bullet .range.s2 { fill: #ccc; }
        .bullet .measure.s0 { fill: lightsteelblue; }
        .bullet .measure.s1 { fill: steelblue; }
        .bullet .title { font-size: 14px; font-weight: bold; }
        .bullet .subtitle { fill: #999; }
      • Paste the CSS bullet styles into the vizPacker above the render function:
        paste in CSS styles.png

      • Convert the CSS styles into a single string and inject them into the page using jQuery. The changes are the line-continuation backslashes at the end of each line and the !important declaration, which I added to stop Lumira from overriding the font style (I did not do that in the video I recorded):
      • var pageStyles = "\

        .bullet { font: 10px sans-serif !important; }\
        .bullet .marker { stroke: #000; stroke-width: 2px; }\
        .bullet .tick line { stroke: #666; stroke-width: .5px; }\
        .bullet .range.s0 { fill: #eee; }\
        .bullet .range.s1 { fill: #ddd; }\
        .bullet .range.s2 { fill: #ccc; }\
        .bullet .measure.s0 { fill: lightsteelblue; }\
        .bullet .measure.s1 { fill: steelblue; }\
        .bullet .title { font-size: 14px; font-weight: bold; }\
        .bullet .subtitle { fill: #999; }\

        ";

        $("head").append("<style>" + pageStyles + "</style>");

    15. Validate your code inside the vizPacker:
      • Click on Run Code
        Run code.png
      • If you get a JavaScript error message then there is something wrong with your code:
        • Close the JavaScript error message
        • Right-click on the code editor and select Inspect Element to enable the Google Chrome developer area
        • Then go to the Console tab and look at the error messages
        • Fix your code in the vizPacker code editor and then click Run Code again

    16. Once your code is validated, preview your extension inside the vizPacker:
      • Click RUN CODE
      • Click to turn on preview mode:
        Turn on preview.png
      • The preview window will appear inside the code editor and you should see your Bullet Chart:
        testing in vizPacker.png
    17. Next you are ready to pack(age) and test inside SAP Lumira

     

    Pack and install your extension into SAP Lumira

    1. Click Pack
    2. The extension will be packed into a ZIP file. Click the ZIP file link:
      pack.png
    3. Google Chrome will download the packed ZIP file
    4. Select the downloaded ZIP file and select Show in folder
    5. Extract the packed extension and install it into SAP Lumira:
      • Extract out all the files and folders from the ZIP file
      • Copy the bundles/comsapsampled3bulletchart folder to <installdir>\SAP Lumira\Desktop\extensions\bundles

        For example: <installdir>\SAP Lumira\Desktop\extensions\bundles\comsapsampled3bulletchart
    6. Next we can test the extension inside SAP Lumira

     

    Test your D3 chart inside SAP Lumira

    1. Start SAP Lumira
    2. Import the bullet chart test CSV file from here
    3. Select the Bullet Chart extension:
      Select Bullet Chart extension.png
    4. Use these settings with the chart:
      Titles: Title, Subtitle
      Actuals: Actual, Pace
      Target: Target
      Ranges: Range 1, Range 2, Range 3
    5. The Bullet Chart should now look like this when running in Lumira:
      Running in Lumira.png

      Note: If this does not work, you will need to refer to the SAP Lumira SDK Getting Started Guide on how to debug within SAP Lumira

    6. That's it, you have created, packaged and tested your first D3 extension using the vizPacker!
    7. Congratulations and happy coding...

     

    More information

     

    Have a look at the:

     

    Resources

    Oracle Database 11g R2 PSR 11.2.0.4 Now Available for SAP


    Oracle Database 11g Release 2 Patch Set 3, i.e. PSR 11.2.0.4, is finally available for SAP as well.

    As a general (non-SAP) Oracle Database release it had already been available since last summer/autumn, so SAP support arrives roughly half a year later.

    ORCL0000.JPG

    The patch set is provided not only for the regular Oracle Database but also for Oracle Exadata at the same time.

    The OS platforms are currently UNIX (AIX, HP-UX, Solaris) and Linux only; Windows is expected in March.

    Since Windows Server 2012 is supported only with Oracle 11.2.0.4, Windows users will need to wait a little longer.

    SAP on Windows/Oracle users could previously only choose Windows Server 2008 as a target platform for server upgrades, so this has been a long time coming. Oracle Database Appliance support also appears to be due in March.

     

    PSR 11.2.0.4 is the latest, and terminal (final), patch set; it contains many bug fixes and is considered a stable version, so existing users should consider applying it.

    Note that standard support for Oracle Database 11.2 runs until the end of January 2015 and extended support until the end of January 2018 (though there is no additional fee for the first year, 2016), so you should plan with the migration to Oracle Database 12c in mind; SAP support for 12c is scheduled to start this year.

     

    For details, please refer to the SAP Notes below.

     

    The installation media and patches are provided via the SAP Software Download Center (http://service.sap.com/oracle).

    Unfortunately, at the moment the installation media still appear to be in preparation. The relevant SAP Note says they are due within two or three days, so they should be published next week. Only the SAP Bundle Patch (the February 2014 edition) was already available.

    ORCL0001.JPG

    Once the installation media are published, I plan to find some time to walk through the database upgrade procedure.

    Share your RS architecture


    RS (SAP Replication Server) is a very powerful and interesting product.

    You can construct your environment however you like.

    You can move your data wherever you want.

    Here are some architectures I've seen.

     

    1. 40+ ASE servers and 1 RS

    zz.JPG

    New SAP Analytics blog post released


    Custom Screen in Transaction Manager, Facility


    Custom Screen in Transaction Manager - TM_61/62/63

     

    In this blog, let's see how to enhance the Transaction Manager screen using a BADI. Basically, we will add a new custom tab.

     

    The Transaction Manager transaction used here is TM_61.

     

    The basic table behind this transaction is VTBFHA.

     

    We need to create an append structure for this table containing the fields that should appear on the screen.

     

    append.JPG

    Now comes the tricky part: which BADI to use and how to implement the logic.

    The BADI to be used here is 'FTR_CUSTOMER_EXTENT'.

     

    Create a BADI implementation for this BADI.

     

    BADI.JPG

    The next step is to create a function group.

     

    Now we need two things.

    1. A screen containing the new fields, created in this function group.

    Note: The screen must be created as a subscreen.

     

    screen.JPG

     

     

    scrren1.JPG

     

    scrren2.JPG

    Write Logic in PBO & PAI accordingly.

     

    PBO1.JPG

     

    Now comes one more interesting aspect: how to handle these custom fields in your PBO.

     

    We need to get the custom data fields using the 'GET_CUST_DATA' method and then apply any further logic.

     

     

     

    FORM 100_PREPARE_OUTPUT  CHANGING P_VTMFHA.
      DATA: L_FHA_APPENDS LIKE LINE OF G_TAB_FHA_APPENDS.
      DATA: LD_TEXT1 TYPE T005T-LANDX,
            LD_NAME1 TYPE ZZNAME,
            LD_NAME2 TYPE T880-NAME1.

    * Try to get the append data from VTBFHA via the customer-data proxy
      CALL METHOD G_PROXY_CUST_DATA->GET_CUST_DATA
        IMPORTING
          PE_TAB_FHA_APPENDS = G_TAB_FHA_APPENDS
        EXCEPTIONS
          OTHERS             = 4.

    * CONTENT holds all custom values concatenated; split it by fixed offsets
      LOOP AT G_TAB_FHA_APPENDS INTO L_FHA_APPENDS.
        IF NOT L_FHA_APPENDS-CONTENT IS INITIAL.
          VTMFHA-ZZGENERATION = L_FHA_APPENDS-CONTENT+0(2).
          VTMFHA-ZZCOUNTRY    = L_FHA_APPENDS-CONTENT+2(3).
          VTMFHA-ZZBENEFI     = L_FHA_APPENDS-CONTENT+5(10).
          VTMFHA-ZZTRADPAR    = L_FHA_APPENDS-CONTENT+15(25).
        ENDIF.
      ENDLOOP.

    * Read the descriptive texts for the custom keys
      IF NOT VTMFHA-ZZCOUNTRY IS INITIAL.
        SELECT SINGLE LANDX INTO LD_TEXT1 FROM T005T
          WHERE LAND1 = VTMFHA-ZZCOUNTRY AND SPRAS = SY-LANGU.
        IF SYST-SUBRC = 0.
          T005T-LANDX = LD_TEXT1.
        ENDIF.
      ENDIF.

      IF NOT VTMFHA-ZZBENEFI IS INITIAL.
        SELECT SINGLE NAME1 INTO LD_NAME1 FROM ZTM_CUST_BOND
          WHERE ZCUSTOMER = VTMFHA-ZZBENEFI AND LAND1 = VTMFHA-ZZCOUNTRY.
        IF SYST-SUBRC = 0.
          ZTM_CUST_BOND-NAME1 = LD_NAME1.
        ENDIF.
      ENDIF.

      IF NOT VTMFHA-ZZTRADPAR IS INITIAL.
        SELECT SINGLE NAME1 INTO LD_NAME2 FROM T880
          WHERE RCOMP = VTMFHA-ZZTRADPAR.
        IF SYST-SUBRC = 0.
          T880-NAME1 = LD_NAME2.
        ENDIF.
      ENDIF.

    * Apply the field modifications (input/display) set up in the BADI
      CALL METHOD G_PROXY_FMOD->APPLY_FMOD.
    ENDFORM.

     

    Next is PAI

    PAI.JPG

    PAI2.JPG

    In PAI, you need to concatenate all the values into CONTENT and pass the result via SET_CUST_DATA; the system takes care of splitting and storing it (a rough sketch follows).
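    Since the PAI screenshots may not carry the code, here is a minimal, hypothetical sketch of that step. It assumes SET_CUST_DATA accepts the append table analogously to GET_CUST_DATA; the PI_TAB_FHA_APPENDS parameter name is a guess, so check the method signature in your system:

      DATA: L_FHA_APPENDS LIKE LINE OF G_TAB_FHA_APPENDS.

    * Concatenate the custom fields back into CONTENT using the same fixed
    * offsets that PBO used for splitting (2 + 3 + 10 + 25 characters).
      L_FHA_APPENDS-CONTENT+0(2)   = VTMFHA-ZZGENERATION.
      L_FHA_APPENDS-CONTENT+2(3)   = VTMFHA-ZZCOUNTRY.
      L_FHA_APPENDS-CONTENT+5(10)  = VTMFHA-ZZBENEFI.
      L_FHA_APPENDS-CONTENT+15(25) = VTMFHA-ZZTRADPAR.
    * In a real implementation, update the existing table line instead of
    * blindly appending a duplicate.
      APPEND L_FHA_APPENDS TO G_TAB_FHA_APPENDS.

    * Hand the data back to the framework, which stores it with the deal.
      CALL METHOD G_PROXY_CUST_DATA->SET_CUST_DATA
        EXPORTING
          PI_TAB_FHA_APPENDS = G_TAB_FHA_APPENDS. "parameter name assumed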

     

    2. A function module that we will use in the BADI to call this screen. Its interface is standard (copied from the standard FM), but you need to implement your own logic inside so that the screen created above is called.

    FM.JPG

     

     

     

      DATA: L_BADI_TABS LIKE LINE OF PC_TAB_BADI_TABS.
      DATA: L_TAB_MOD_FIELDS TYPE FTRG_TAB_FIELD_MODIFY.
      DATA: L_MOD_FIELDS LIKE LINE OF L_TAB_MOD_FIELDS.

    * Save the references in the global memory of the function group first
      G_PROXY_TRANSACTION  = PI_PROXY_TRANSACTION.
      G_PROXY_CUST_DATA    = PI_CUST_TRANSACTION.
      G_PROXY_MESSAGES     = PI_PROXY_MESSAGES.
      G_PROXY_FCODE        = PI_PROXY_FCODE.
      G_PROXY_FMOD         = PI_PROXY_FMOD.

    * Register our subscreen for the first custom tab
      L_BADI_TABS-REPID    = 'SAPLZFI_TRM_TM_BOND'. "Report (function group program)
      L_BADI_TABS-DYNNR    = '0100'.       "Subscreen
      L_BADI_TABS-TEXT_TAB = TEXT-001.     "Text (max. 30 chars) to be displayed
      MODIFY PC_TAB_BADI_TABS FROM L_BADI_TABS
                              TRANSPORTING REPID DYNNR TEXT_TAB
                              WHERE FCODE = C_CUSTOM_GUI_FCODE1. "1st FCODE

    * Make the custom fields input-enabled everywhere except the display
    * transaction TM_63, where they are display-only
      L_MOD_FIELDS-TABNAME = 'VTMFHA'.
      L_MOD_FIELDS-FIELDNAME = 'ZZGENERATION'.
      IF SY-TCODE NE 'TM_63'.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_INPUT.
      ELSE.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_DISPLAY.
      ENDIF.
      APPEND L_MOD_FIELDS TO L_TAB_MOD_FIELDS.

      L_MOD_FIELDS-TABNAME = 'VTMFHA'.
      L_MOD_FIELDS-FIELDNAME = 'ZZCOUNTRY'.
      IF SY-TCODE NE 'TM_63'.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_INPUT.
      ELSE.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_DISPLAY.
      ENDIF.
      APPEND L_MOD_FIELDS TO L_TAB_MOD_FIELDS.

      L_MOD_FIELDS-TABNAME = 'VTMFHA'.
      L_MOD_FIELDS-FIELDNAME = 'ZZBENEFI'.
      IF SY-TCODE NE 'TM_63'.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_INPUT.
      ELSE.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_DISPLAY.
      ENDIF.
      APPEND L_MOD_FIELDS TO L_TAB_MOD_FIELDS.

      L_MOD_FIELDS-TABNAME = 'VTMFHA'.
      L_MOD_FIELDS-FIELDNAME = 'ZZTRADPAR'.
      IF SY-TCODE NE 'TM_63'.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_INPUT.
      ELSE.
        L_MOD_FIELDS-ATTRIBUTE = C_FMOD_DISPLAY.
      ENDIF.
      APPEND L_MOD_FIELDS TO L_TAB_MOD_FIELDS.

    * Pass the field modifications to the framework
      CALL METHOD G_PROXY_FMOD->SET_FIELDMOD
        EXPORTING
          PI_MODIFIED_FIELDS = L_TAB_MOD_FIELDS.

     

    Now you are done with the steps; when you open transaction TM_61 you will find the new tab appearing on the screen.

     

    TM63.JPG

     

     

    You can follow similar steps for the other transactions where you need to enhance a screen using a BADI. To confirm that a BADI can be used this way, check whether it provides an FCODE and a screen (tab) table.

     

    Appreciate your comments & feedback.

    How SAP Education Took Part in SAP Forum 2014


    We thank everyone who visited us at SAP Forum 2014, and we would like to share what the Education area presented:

     

    - Info sessions on SAP Workforce Performance Builder, SAP Enterprise Learning and SAP User Experience Management by Knoa;

    - A hands-on session using SAP ERP Simulation by Baton Simulations;

    - Presentations: mini sessions about Learning in the Cloud;

    - A roundtable between SAP Education, customers and partners.

     

    To learn about SAP Education's participation in detail, see the blog post in English, available via the link SAP Education at SAP Forum Brazil 2014 | SCN

     

    We hope to see you at the SAP Skills 2014 event, which will be held in May.

     

    Best regards,

     

    Maurício Schorsch

    Changing Default Values for the Installed Base Search Screen in CRM


    This was a complicated request and hopefully will help some of you.

     

    Requirement: By default, the Installed Base search screen displays Search For as “Header Using Header Data”. In our case this default had to be changed to “Header Using Partner Data”.

     

    Image1.jpg

     

     

    After struggling a lot with debugging and some code changes, I found the following simple solution.

     

    Solution:

     

    Change the Repository.xml file.

    1. Repository.xml file changes:

     

    FROM:

     

    <NavigationalLink name="TO_HEADERSEARCH">
    <Source outboundPlugRef="headersearch" viewRef="CRMCMP_IBSEARCH/IBSearchViewSet"/>
    <Targets>
    <Target inboundPlugRef="default" viewRef="CRMCMP_IBSEARCH/HeaderByHeader"/>
    <Target inboundPlugRef="default" viewRef="CRMCMP_IBSEARCH/HeaderResultList"/>
    </Targets>
    </NavigationalLink>

     

    TO:

     

    <NavigationalLink name="TO_HEADERSEARCH">
    <Source outboundPlugRef="headersearch" viewRef="CRMCMP_IBSEARCH/IBSearchViewSet"/>
    <Targets>
    <Target inboundPlugRef="default" viewRef="CRMCMP_IBSEARCH/HeaderByPartner"/>
    <Target inboundPlugRef="default" viewRef="CRMCMP_IBSEARCH/HeaderResultList"/>
    </Targets>
    </NavigationalLink>

     

     

    FROM:

     

    </NavigationalLink>
    <NavigationalLink name="SEARCH_HEAD_f4">
    <Source outboundPlugRef="SEARCH_HEAD_F4" viewRef="MainWindow"/>
    <Targets>
    <Target inboundPlugRef="DEFAULT" viewRef="CRMCMP_IBSEARCH/HeaderByHeader"/>
    <Target inboundPlugRef="DEFAULT" viewRef="CRMCMP_IBSEARCH/HeaderResultList"/>
    </Targets>
    </NavigationalLink>

     

    TO:

     

    </NavigationalLink>
    <NavigationalLink name="SEARCH_HEAD_f4">
    <Source outboundPlugRef="SEARCH_HEAD_F4" viewRef="MainWindow"/>
    <Targets>
    <Target inboundPlugRef="DEFAULT" viewRef="CRMCMP_IBSEARCH/HeaderByPartner"/>
    <Target inboundPlugRef="DEFAULT" viewRef="CRMCMP_IBSEARCH/HeaderResultList"/>
    </Targets>
    </NavigationalLink>

       

     

    Output: The default is now “Header Using Partner Data”.

    Image2.jpg

    Debugging CRM Middleware without disabling Queue.


    One way to debug the CRM Middleware is to disable the inbound queue in ECC or the outbound queue in CRM. This caused great inconvenience to others working on the same server who were trying to create similar documents in ECC. This blog is about debugging the CRM Middleware without disabling the queue. The example used here is a debit memo request (DMR) created in ECC once a service confirmation is saved and completed in CRM.

     

    • Place an external breakpoint in function module CRM_R3_SERVICECONF_UPLOAD.

     

    Image3.jpg

     

    • Create a service confirmation in CRM, fill the mandatory fields, complete and save the confirmation.

     

    • By default GV_SYNCHRONOUS_CALL is initial and the function ‘BAPI_SERVICECONF_PROXY_UPLOAD’ is called as a background task. Once GV_SYNCHRONOUS_CALL is set to ‘X’ in the debugger, the function is called in the foreground and can be debugged all the way into ECC.

    Image2.jpg

     

    • Function module CRS_SERVICE_BILLING_PROCESS processes the service confirmations.

     

       Image1.jpg

    My First Tech Love - More Than a Machine


    For many of us growing up in the eighties, our first tech love was the video game: large, clunky machines that possessed the power to hypnotize young men into spending their last hard-earned dollar on graphic objects and characters that jump, swing and fly around an imaginary world made of pixels.
    Tech love small.jpeg

     

    Of course, back then the pixels were much larger (in fact, they were huge) and the screen resolution was, by today’s standards, well… just bad.

     

    But we loved them - we became addicted to them - we spent hours upon hours staring at a screen that provided us with visual, sensory, and mental overload.

     

    Our parents said it was a waste of time, a waste of money. “Find something else to do outside,” my mother pleaded, “something worthwhile!”

     

    For many of us it was never a waste of time or money; rather, it was extremely stimulating for the brain and kept us out of trouble.

     

    But who could ever forget their first personal home computer? My first tech love was a personal computer and video game system all in one: the Commodore 64!

     

    Commodore 64.jpg
    Born in 1982, the Commodore 64 was a computer that housed its CPU within a simple keyboard. It had no monitor, because you hooked it up to your TV set; and if you wanted to write and save your own programs you needed something called a floppy drive. A floppy drive took floppy disks (a relic by today’s standards: about the size of a CD, but housed in a square, paper-like casing, hence the name floppy).

     

    This wonderful piece of machinery was not only a personal computer but also a video game system, with a whopping 64K of memory! (That was huge in those days.) As a matter of fact, Commodore held a massive share of the home computer market, beating out Apple and all its competitors from 1983 to 1986, and the C64 still remains the top-selling computer of all time.

     

    With that said… this technology was amazing! The 64 did everything from personal computing to video games, and it housed the first digital dictionary and almanac I had ever seen. Not only did I write my first programs on it, but I reminisce about countless soda-drinking, pizza-eating days and nights bonding over this device with friends who remain close to this very day.

     

    So this was my first Tech Love. What was yours?

     

    ###

     

    Edward Amaral is a Senior Visual Designer at SAP. Follow Edward on Twitter.

     

    Do you have a story about your first tech love? Join SCN and post your story here on Customer Edge. Or, if it fits into 140 characters—send us a tweet and a pic @CustEdge on Twitter.  For inspiration, take a look at our video about why the best connections are personal connections >
