Saturday, 2 December 2017

Jalo Layer

Jalo Layer offers an API to clients and abstracts away the underlying database. It covers two major aspects:
  • Data Model, which you can define in the items.xml files in the form of types and attributes.
  • Business Logic, which you implement in Java classes.

For each type defined in the items.xml file, the Build Framework generates two Java files:
  • Abstract Java class: Automatically generated by the Build Framework and regenerated on every build. Abstract Java classes contain automatically generated getter and setter methods for the type attributes defined in the items.xml file.
  • Non-Abstract Java class: Extends the abstract Java class. Non-abstract Java classes are only generated by the Build Framework if they do not already exist, and they are not overwritten during builds.

The Jalo Layer tightly couples the data model and the business logic, because the business logic is implemented in Java classes that are generated from the data model. If the data model changes, your business logic might need to be adapted as well. For example, if you rename a type, the Java classes have to be renamed too. 

You can implement Java classes that are not backed by the data model, but their instances are runtime objects only and are not persistent. An example of such a non-persistent Java class is the JaloSession.
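
For instance, the current JaloSession can be obtained and used at runtime, but nothing stored on it ends up in the database (a minimal sketch; the attribute name is made up):

import de.hybris.platform.jalo.JaloSession;

// the current session is a pure runtime object, not an item in the database
final JaloSession session = JaloSession.getCurrentSession();
// session attributes live only as long as the session itself
session.setAttribute("sampleFlag", Boolean.TRUE);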

Using the Jalo Layer

The basic workflow of using the Jalo layer is as follows:
  • Define your data model in terms of types and attributes using the items.xml file.
<itemtype code="MyType" autocreate="true" generate="true"
          jaloclass="de.hybris.jalolayer.sample.MyType">
    <attributes>
        <attribute type="java.lang.String" qualifier="MyAttribute">
            <persistence type="property"/>
        </attribute>
    </attributes>
</itemtype>
  • Build SAP Hybris Commerce to have the Java classes generated.
    • gensrc/de/hybris/jalolayer/sample/GeneratedMyType.java
      • Abstract Java class, containing getter and setter methods for MyAttribute such as:
        • public String getMyAttribute(final SessionContext ctx)
        • public String getMyAttribute()
        • public void setMyAttribute(final SessionContext ctx, final String value)
        • public void setMyAttribute(final String value)
    • src/de/hybris/jalolayer/sample/MyType.java
      • Non-abstract Java class extending GeneratedMyType. 
  • Implement business logic in the non-abstract Java class.
    • In MyType.java, you can implement your business logic and optionally override the getter and setter methods for MyAttribute if you need special business logic for that.
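
For example, MyType.java could end up looking like the following minimal sketch (the trimming rule is purely illustrative business logic):

package de.hybris.jalolayer.sample;

import de.hybris.platform.jalo.SessionContext;

public class MyType extends GeneratedMyType
{
    @Override
    public void setMyAttribute(final SessionContext ctx, final String value)
    {
        // illustrative business rule: normalize the value before it is persisted
        super.setMyAttribute(ctx, value == null ? null : value.trim());
    }
}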

Friday, 1 December 2017

Initializing and Updating SAP Hybris Commerce

There are two representations of the Type System:
  • File-based representation, spread across the various items.xml files of Hybris extensions. 
    • Not actively used by Hybris at runtime. 
    • You can modify this representation at any time by editing the items.xml file of an extension, but the changes only take effect after an initialization or update of Hybris. 
  • Database representation, stored in the Hybris database. 
    • Used by Hybris at runtime. 
    • Reflects the state the type system was in when Hybris was last updated or initialized.


Initialization
Initialization drops existing type definitions from the database prior to rebuilding, so the entire type system is created from scratch. 


Method 1:
1. HAC > Platform > Initialization
2. The Initialization page now displays
3. Click the Initialize button

Method 2:
From the command line, by running ant initialize.  
Optionally, you can pass the parameter -Dtenant=$tenantname
The configuration from HAC can be reused to initialize the system from the command line by passing all configuration options via -DconfigFile=<your file>.
Clicking the Dump Configuration button in HAC exports all the current configuration options for the initialization.
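For example (the configuration file name is illustrative):
ant initialize -Dtenant=master -DconfigFile=initialization.properties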

Activities:
  • Aborts all running cron jobs
  • Removes tables from the database schema using DROP TABLE statements. This removes not only Hybris data, but all data stored in those tables. During the initialization process, the system only removes tables that are declared in its current items.xml files; old tables (orphaned data) stay intact.
  • The init process prepares the DDL and DML scripts and executes them (unless dry run is enabled, in which case the scripts are only generated). This is how the schema is prepared and persisted into the database; the DDL scripts also contain the statements that remove tables. 
  • Clears cache
  • Creates a number of media folders. 
  • Sets licenses
At this point the type system has been initialized. 
  • The initialization continues with creating essential data and project data (project data creation is optional, but enabled for all extensions by default). 
  • Furthermore, during the init process the HMC configuration is cleared and the types are localized.
* During an initialization, only the creation of project data is optional; all other steps are always executed. 

Generate Initialization Scripts without executing

Method 1:
1. HAC > Platform > Initialization
2. The Initialization page now displays
3. Click the SQL Scripts button.
4. Click the Generate scripts button for initialization.
The generated scripts can be seen directly on the web page or can be downloaded as a ZIP file.

Method 2:
From the command line, by running ant initialize -DdryRun=true 
The dryRun parameter set to true means that the scripts are generated but not executed.

The generated SQL init scripts can be found in the following files:
<HYBRIS_TEMP_DIR>/init_<TENANT>_drop_schema.sql
<HYBRIS_TEMP_DIR>/init_<TENANT>_schema.sql
<HYBRIS_TEMP_DIR>/init_<TENANT>_data.sql

Update
Type system definitions in the database are modified to match the new type system definitions in the items.xml files.
The update mechanism makes sure that all data that existed in the system before the update is still accessible after the update. 


About Update
  • Preserves the table name, to which a type was mapped, even if it was changed in items.xml
  • Preserves the column name, to which an attribute was mapped, even if it was changed in items.xml
  • Preserves the column type for an attribute, even if it was changed in items.xml
  • Does not drop any tables and columns
  • Does not delete any item data
  • Drops and recreates indices if they are added or changed in items.xml
  • Does NOT change the attribute from optional to mandatory, even if it was changed in items.xml
  • Does NOT change the attribute from non-unique to unique, even if it was changed in items.xml
  • Type code change is not possible. The system treats such a change as a new type being added. 
  • Forgotten/changed deployment: Changing a deployment name has no effect; the data is still stored under the previous one.

Method 1:
1. HAC > Platform > Update
2. The Update page now displays
3. Click the Update button

Options:
  • Update running system 
    • This rebuilds all type definitions from items.xml files. 
  • Clear the HMC configuration from the database 
    • This removes the timestamp on the HMC configuration in the database and uploads an empty HMC configuration into the database.
    • An empty HMC configuration without timestamp causes the HMC to upload the configuration stored in the hmc.xml files into the database. 
  • Create essential data 
    • Essential data is loaded automatically after the creation of a tenant. 
  • Localize types 
    • To localize the type system 
  • Project Data 
    • To view the list of available extensions, consult the Project Data section on both Initialization and Update pages. 
  • lucenesearch 
    • If you do not want your lucenesearch indexes to be modified in any way, set both values (update.index.configuration and rebuild.indexes) under lucenesearch to false.
    • After sample data has been created, the LucenesearchManager deletes all available Lucene-based indexes. This happens because the default setting of the update.index.configuration box is true.
    • To disable recreating the indexes, make sure the update.index.configuration box is set to false.
    • If the update.index.configuration box is set to false, the LucenesearchManager does not delete the indexes. Instead, it triggers a rebuild of all available Lucene-based indexes. This happens because the default setting of the rebuild.indexes box is true. This process can be time-consuming, and it is not always necessary to have the Lucene indexes up to date.
    • To disable rebuilding the indexes, make sure the rebuild.indexes box is set to false. 
  • Update - Orphaned Types 
    • Type system definitions that exist in the database but are not defined in any items.xml file are referred to as orphaned types. The data backed by orphaned types is safe and can be made accessible again by redefining the type in any items.xml file. However, the data backed by orphaned types and the definitions of the orphaned types themselves remain in the database and continue to use up space. 
  • Restarting Cron Jobs and Tasks 
    • After the initialization or update has completed, Hybris Commerce restarts cron jobs and tasks because the task.engine.loadonstartup property is set to true by default.
    • If task.engine.loadonstartup is set to false, no cron jobs or tasks are restarted automatically.
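For example, to keep cron jobs and tasks from restarting automatically, the property can be set as follows (typically in local.properties):
task.engine.loadonstartup=false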
Method 2:
From the command line, by running ant updatesystem 
Optionally, you can pass the parameter -Dtenant=$tenantname
The configuration from HAC can be reused to update the system from the command line by passing all configuration options via -DconfigFile=<your file>.
Clicking the Dump Configuration button in HAC exports all the current configuration options for the update.

Activities:
  • Type system definitions from all extensions' items.xml files are read in.
  • The type system in the database is modified according to the type definitions of all extensions' items.xml files.
    • Adding newly defined types to the type system definition in the database.
      • Type definitions and attribute definitions that are not part of the type system definition in the database are added.
    • Modifying existing types to match the type system definition in the database.
      • Type definitions and attribute definitions that are changed compared to the type system definition in the database are modified.
    • The update DDL and DML scripts are also generated at this point.
    • The update optionally continues with creating essential data and project data, if selected. 

    Generate Update Scripts without executing

    Method 1:
    1. HAC > Platform > Update
    2. The Update page now displays
    3. Click the SQL Scripts button.
    4. Click the Generate scripts button for update.
    The generated scripts can be seen directly on the web page or can be downloaded as a ZIP file.

    Method 2:
    From the command line, by running ant updatesystem -DdryRun=true 
    The dryRun parameter set to true means that the scripts are generated but not executed.

    The generated SQL update scripts can be found in the following files:
    <HYBRIS_TEMP_DIR>/update_<TENANT>_schema.sql
    <HYBRIS_TEMP_DIR>/update_<TENANT>_data.sql

    Locking SAP Hybris Commerce
    HAC allows locking the system against initialization and update.
    While this lock is in effect, Hybris can neither be updated nor initialized.
    The lock prevents accidental loss of data caused by an unintended initialization or update.
    The lock can be activated or deactivated from either the Initialization or the Update page of HAC.

    Wednesday, 29 November 2017

    Hot Folder Data Importing

    With hot folder data importing, CSV files are imported automatically when they are moved to a folder that is scanned periodically by the system. 

    The acceleratorservices extension comes with a batch package that enables automated importing of data from hot folders. 

    The infrastructure enables the import of CSV files that are internally translated into multi-threaded ImpEx scripts. 

    The infrastructure uses Spring integration to provide a service-based design.

    Diagram of Components
    The classes are structured into three major parts:
    • Tasks executed by the Spring integration infrastructure: 
      • HeaderSetupTask
      • BatchHeader
      • HeaderTask
      • HeaderInitTask
      • ImpexTransformerTask
      • ImpexRunnerTask
      • CleanupTask
    • Converters providing the ImpEx header and converting CSV rows into ImpEx rows with optional filtering: 
      • ImpexConverter
      • ImpexRowFilter
    • Helper and utility classes: 
      • SequenceIdParser
      • RegexParser
      • CleanupHelper

    General Flow
    • Spring integration periodically scans the configured input directory for new files.
    • If new files are found, they are moved to the processing subdirectory and then sent to the Batch Import pipeline, which consists of the following tasks: 
      • HeaderSetupTask
      • HeaderInitTask
      • ImpexTransformerTask
      • ImpexRunnerTask
      • CleanupTask
      • ErrorHandler
    • HeaderSetupTask: Creates a new BatchHeader.
    • HeaderInitTask: Retrieves a sequence ID and (optionally) a language from the file name.
    • ImpexTransformerTask: Creates one or many ImpEx files from the CSV input and writes error lines to the error subdirectory.
    • ImpexRunnerTask: Processes all ImpEx files sequentially with multiple threads.
    • CleanupTask: Deletes all transformed files and moves the imported file with an optionally appended timestamp to the archive subdirectory.
    • ErrorHandler: Deletes all transformed files and moves the imported file with an optionally appended timestamp to the error subdirectory.

    Configuration files
    • hot-folder-spring.xml (<HYBRIS_BIN_DIR>/ext-accelerator/acceleratorservices/resources/acceleratorservices/integration)
    • hot-folder-common-spring.xml (<HYBRIS_BIN_DIR>/ext-template/yacceleratorcore/resources/yacceleratorcore/integration)
    • hot-folder-store-electronics-spring.xml (<HYBRIS_BIN_DIR>/ext-template/yacceleratorcore/resources/yacceleratorcore/integration)
    • hot-folder-store-apparel-spring.xml (<HYBRIS_BIN_DIR>/ext-template/yacceleratorcore/resources/yacceleratorcore/integration)
    • hot-folder-store-powertools-spring.xml (<HYBRIS_BIN_DIR>/ext-template/yb2bacceleratorcore/resources/yb2bacceleratorcore/integration)
    • project.properties (<HYBRIS_BIN_DIR>/ext-accelerator/acceleratorservices)

    Spring Integration Configuration
    • file:inbound-channel-adapter: Scans a directory in a configurable interval and sends files to a configured channel under the following conditions: 
      • Only files matching a specified regular expression are retrieved (filename-regex). 
      • Files are processed in the order defined by the FileOrderComparator, using the following priority rule: 
        • If a priority is configured for the file prefix, it uses the specified priority. 
        • For files with equal priority, the older file is processed first. 
    • file:outbound-gateway: Moves a file to the processing subdirectory.
    • int:service-activator: Activates a referenced bean when receiving a message on a configured channel. The bean response is again wrapped in a message and sent to the configured output channel.
    • int:channel: Sets up a channel.
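
    A simplified, illustrative sketch of how these elements typically fit together (channel names, directories, and bean names are made up; the shipped hot-folder-store-*.xml files differ in detail):

    <!-- namespace declarations for the beans, int, and file schemas are omitted for brevity -->

    <int:channel id="sampleBatchFilesProc"/>

    <!-- scan the input directory for CSV files; fileOrderComparator decides the processing order -->
    <file:inbound-channel-adapter id="sampleBatchFiles" directory="#{sampleBaseDirectory}"
            filename-regex="^(.*)\.csv$" comparator="fileOrderComparator">
        <int:poller fixed-rate="1000"/>
    </file:inbound-channel-adapter>

    <!-- move each file to the processing subdirectory and delete the original -->
    <file:outbound-gateway request-channel="sampleBatchFiles" reply-channel="sampleBatchFilesProc"
            directory="#{sampleBaseDirectory}/processing" delete-source-files="true"/>

    <!-- hand the file over to the first task of the batch import pipeline -->
    <int:service-activator input-channel="sampleBatchFilesProc" output-channel="sampleBatchFilesHeaderInit"
            ref="sampleHeaderSetupTask" method="execute"/>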

    Technical Background
    • Spring integration periodically scans the configured input directory for new files (hot-folder-store-apparel-spring.xml)
    project.properties
     
     
    • New files are moved to the processing subdirectory (delete-source-files=true deletes the original source files after they are written to the destination), and then HeaderSetupTask is called (the method attribute specifies which method of the referenced bean is invoked)
    HeaderSetupTask
    • catalog: The catalog to use. This setting is applied to the default header substitution: $CATALOG$
    • net: The net setting to apply to prices. This setting is applied to the default header substitution: $NET$
    • HeaderInitTask is called
     
    HeaderInitTask
    • sequenceIdParser: The regular expression used to extract the sequence ID from the file name.  
    • languageParser: The regular expression used to extract the language from the file name. 
    • fallbackLanguage: The language to use if the language is not set in the file name. 
    • ImpexTransformerTask is called. Its init-method is executed first, and then the specified method of the bean is invoked.
     
     
    ImpexTransformerTask performs the following tasks: 
    • It retrieves all configured converters matching the file name prefix.
    • For every converter found, it converts the input file as follows: 
      • It adds the ImpEx file header once with substitutions
      • It converts all rows if they are not filtered
      • If the line has missing input fields, it adds the line along with the error message to a file in the error subdirectory
    ImpexTransformerTask
    • fieldSeparator: The separator to use to read CSV files (default value is ,). 
    • encoding: The file encoding to use (default value is UTF-8). 
    • linesToSkip: The lines to skip in all CSV files (default value is 0).
    • converterMap: Used to map file prefixes to one or multiple converters to produce ImpEx files.
    CleanupHelper
    • timeStampFormat: If set, appends a timestamp in the specified format to input files moved to the archive or error subdirectory.
    ConverterMapping
    Converter corresponding to base_product file name prefix
    Converter
    • header: The ImpEx header to use including header substitutions:
      $NET$: the net setting
      $CATALOG$: the catalog prefix
      $LANGUAGE$: the language setting
      $TYPE$: an optional type attribute that can be applied if filtering is configured 
    • impexRow: The template for an ImpEx row adhering to the syntax:
      Syntax: {('+')? (<columnId> | 'S')}
      The '+' character adds a mandatory check to this column. Any lines with missing attributes are written to an error file in the error subdirectory.
      The 'S' can be used for writing the current sequence ID at the template position. Optionally, columns can be quoted by enclosing the column in the template with quotation marks.
    • rowFilter: An optional row filter. The supplied expression must be a valid Groovy expression. The current row map consisting of column ID and value is referenced by row.
      Configuring multiple converters with row filters gives the option to split a supplied CSV input file into different ImpEx files according to specified filter criteria. 
    • type: An optional type that can be retrieved in the header using the header substitution $TYPE$.
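    For illustration (column indexes and values are made up): an impexRow template such as
    {S};{+0};{1}
    would write the current sequence ID, require a value in column 0, and copy column 1, while a rowFilter expression such as
    row[2] == 'apparel'
    would keep only the rows whose third column equals apparel, so a second converter with a different filter could route the remaining rows into another ImpEx file.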
    Converter corresponding to variant file name prefix

    ImpexRowFilter

    • ImpexRunnerTask is called
     
    All generated ImpEx files are sent sequentially to the platform ImportService by the AbstractImpexRunnerTask.
    • CleanupTask is called
     

    Tuesday, 28 November 2017

    The Type System

    Type = type definition in items.xml + its Java implementation
    Item = object instance of a Type


    The Item type is the supertype of all types. 

    Attributes of a Composed Type

    • Can be a reference to
      • a Composed Type
      • a basic Java Type
    • Can have a localized name and description
    • Can have a default value

    Creating Types
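    A hedged, minimal example of defining a new type in an extension's items.xml (type name, table name, and typecode are illustrative; choose a typecode that does not clash with existing deployments):

    <itemtype code="SampleType"
              extends="GenericItem"
              jaloclass="de.hybris.sample.jalo.SampleType"
              autocreate="true"
              generate="true">
        <deployment table="sampletype" typecode="10001"/>
        <attributes>
            <attribute qualifier="code" type="java.lang.String">
                <modifiers optional="false"/>
                <persistence type="property"/>
            </attribute>
        </attributes>
    </itemtype>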

    Adding New Attributes to a Type
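    A hedged sketch of adding a new attribute to an already existing type by re-declaring the type in your own items.xml with autocreate and generate set to false (the attribute name is illustrative):

    <itemtype code="Product" autocreate="false" generate="false">
        <attributes>
            <attribute qualifier="sampleFlag" type="java.lang.Boolean">
                <modifiers optional="true"/>
                <persistence type="property"/>
            </attribute>
        </attributes>
    </itemtype>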

    Moving a Type (from one extension to another)
    Move TypeA from ExtensionA to ExtensionB
    extensiona-items.xml
    <itemtype
        generate="true"
        code="TypeA"
        jaloclass="de.hybris.extensiona.jalo.TypeA"
        extends="GenericItem"
        autocreate="true" >

       <deployment table="mytype_deployment" typecode="12345"/>
       ...

    </itemtype>
    1. Change the package path of jaloclass 
    2. Move the type definition to new items.xml file
    3. Build -> Start -> Update
    extensionb-items.xml
    <itemtype
        generate="true"
        code="TypeA"
        jaloclass="de.hybris.extensionb.jalo.TypeA"
        extends="GenericItem"
        autocreate="true" >

       <deployment table="mytype_deployment" typecode="12345"/>
       ...

    </itemtype>

    Limitations
    • Not allowed to change the deployment typecode
    • Cannot move a type if the move breaks the extension dependencies (classpath). For example, if TypeC in ExtensionC extends TypeA, you cannot move TypeA from ExtensionA to ExtensionB: compilation fails because ExtensionC still needs TypeA, which is used to extend TypeC. A possible workaround is to make ExtensionC depend on ExtensionB, but that may not fit your business objectives.

    Available Types
    core-items.xml (/platform/ext/core/resources)

    1. AtomicTypes
    2. CollectionTypes
    3. EnumerationTypes
    4. MapTypes
    5. RelationTypes
    6. ItemTypes
    AtomicTypes
    An AtomicType does not have a code attribute; instead, the class attribute is used as its reference

    To localize atomic types, add the following in locales_xx.properties (resources/localization)
    type.localized:<class attribute>.name=<value>
    type.localized:<class attribute>.description=<value>



    Create a new atomic type
      • Create a class for the atomic type, e.g. PK.java
      • The created class has to implement the Serializable interface


      • Define the atomic type in the items.xml file
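
      A hedged example of such a declaration (this roughly mirrors the PK entry shipped in core-items.xml):

      <atomictype class="de.hybris.platform.core.PK"
                  extends="java.lang.Object"
                  autocreate="true"
                  generate="false"/>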

      CollectionTypes
      code attribute - unique identifier
      elementtype attribute - type of elements in the collection
      type attribute – List (ordered items), Set (no duplicates), SortedSet (unique ordered items)
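
      A hedged example of a collection type declaration (the code is illustrative; the list/set/sortedset value goes into the type attribute):

      <collectiontype code="SampleStringList"
                      elementtype="java.lang.String"
                      autocreate="true"
                      generate="false"
                      type="list"/>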



      CollectionTypes vs. RelationTypes (use RelationTypes whenever possible)
      • The maximum length of a database field is limited, so the stored collection value may get truncated.
      • The database entry only contains the PKs of the elements, so you cannot get more details on each PK; you would need to query again to get further details.

      EnumerationTypes
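
      A hedged example of an enumeration type declaration in items.xml (the code and values are illustrative):

      <enumtype code="SampleStatus" autocreate="true" generate="true">
          <value code="OPEN"/>
          <value code="CLOSED"/>
      </enumtype>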

      Storage of localized values
      Localized values are stored in a separate database table whose name is the name of the table the type is stored in plus the suffix lp (localized property).
      E.g., if the type is stored in the table sampletype, then its localized values are stored in the table sampletypelp.
       

      MapTypes
      argumenttype attribute - Key
      returntype attribute - Value
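
      A hedged example of a map type declaration (this roughly mirrors the localized string map shipped in core-items.xml):

      <maptype code="localized:java.lang.String"
               argumenttype="Language"
               returntype="java.lang.String"
               autocreate="true"
               generate="false"/>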

      RelationTypes
      Internally, the elements on both sides of the relation are linked together via instances of a helper type called LinkItem.
      LinkItems hold two attributes, SourceItem and TargetItem that hold references to the respective items.
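
      A hedged sketch of a many-to-many relation declaration, the kind of relation that is persisted via LinkItems (type names, qualifiers, table, and typecode are illustrative):

      <relation code="SampleProduct2Keyword"
                autocreate="true"
                generate="true"
                localized="false">
          <deployment table="sampleprod2keyword" typecode="10002"/>
          <sourceElement type="SampleProduct" qualifier="products" cardinality="many"/>
          <targetElement type="SampleKeyword" qualifier="keywords" cardinality="many"/>
      </relation>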


      ItemTypes
      ItemTypes/ComposedTypes hold meta information on types and the types' attributes and relations, including the item type's code (unique identifier), its JNDI deployment location, the database table the item will be stored in and the type's Java class. 

      Every type may have any number of attributes. 


      Every inherited attribute has its settings stored separately for each child type. This makes it possible to override attribute access rights inherited from the supertype: for example, you can set an attribute to be writable on a self-defined subtype even though it was not writable on the type where the attribute was originally defined.



      Creating Items at runtime

      Creating Items in the Extension's Manager
      Every extension has a Manager that is responsible for item handling. By calling the Manager's creation methods and passing the necessary parameters for the new item as a Map, you can create items.

      ...
      // hwwj (the job item) and jobNum are defined elsewhere in the surrounding code
      final Map<String, Object> params = new HashMap<String, Object>();
      params.put( HelloWorldWizardCronJob.SCREENTEXT, getScreenText() );
      params.put( HelloWorldWizardCronJob.ACTIVATE, isActivate() );
      params.put( HelloWorldWizardCronJob.INTERVAL, getInterval() );
      params.put( HelloWorldWizardCronJob.CODE, "HelloWorldWizardCronJob" + String.valueOf( jobNum ) );
      params.put( HelloWorldWizardCronJob.JOB, hwwj );
      final HelloWorldWizardCronJob hwwcj = HelloWorldWizardManager.getInstance().createHelloWorldWizardCronJob( params );
      ...

      Creating Items Generically
      To create a new instance of a certain type,

      • Select the respective ComposedType and
      ComposedType item = getSession().getTypeManager().getComposedType("mySampleItem");
      • Call its newInstance() method
      item.newInstance(ctx, params);
      Pass the necessary parameters for the new item as a Map

      initial modifier
      To set an attribute to be initial, set the initial modifier in items.xml to true. An initial attribute can be given a value when the item is created; combined with write="false", this effectively makes the attribute read-only after creation.
      <attributes>
         <attribute qualifier="code" type="java.lang.String">
            <modifiers initial="true"/>
            <persistence type="property"/>
         </attribute>
      </attributes>

      Checking the Mandatory Item Attributes
      Inside the createItem() method, we can use the checkMandatoryAttribute() method to check whether each mandatory attribute has a value.

      checkMandatoryAttribute(MyType.MYATTRIBUTE1, allAttributes, missing, true);
      checkMandatoryAttribute(MyType.MYATTRIBUTE2, allAttributes, missing, false);
      This method has four parameters:
      • String qualifier - Reference to the attribute to check
      • ItemAttributeMap allAttributes - Map with the initial attribute values
      • Set missingSet - Set which accepts references to all attributes for which no value has been set
      • Boolean nullAllowed - Specifies whether a null value in the ItemAttributeMap is written to the item as a null value (true) or whether the null value is treated as a missing value (false). Defaults to false.

      Relations

      Redeclaring 1:n Relations (redeclare=true)
      Define a relation on an abstract level (between two abstract classes), and then re-declare this relation in concrete classes to point to concrete subclasses.

      Steps:
      • Define a relation between two abstract classes
      • Redeclare the definition of Order item such that it contains only OrderEntries
      • Add OrderEntry CollectionType
      • Redeclare the definition of OrderEntry item
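
      A hedged sketch of what the redeclaration could look like in items.xml, following the Order/OrderEntry naming used above (the collection type definition and modifiers are illustrative):

      <collectiontype code="OrderEntryCollection" elementtype="OrderEntry" autocreate="true" generate="false"/>

      <itemtype code="Order" autocreate="false" generate="false">
          <attributes>
              <attribute qualifier="entries" type="OrderEntryCollection" redeclare="true">
                  <modifiers read="true" write="true" optional="true"/>
              </attribute>
          </attributes>
      </itemtype>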

      Custom Ordering
      Custom property: ordering.attribute
      Specifies which attribute is used to order the many-side items when they are retrieved from the database. 

      The many side is defined with ordered="false": because an existing attribute provides the ordering, there is no need for the ORM to add an additional ordering column.
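
      A hedged sketch of such a relation, assuming a hypothetical entryNumber attribute provides the ordering (names are illustrative, and placing the custom property on the many-side element is an assumption):

      <relation code="SampleOrder2Entry" autocreate="true" generate="true" localized="false">
          <sourceElement type="SampleOrder" qualifier="order" cardinality="one"/>
          <targetElement type="SampleOrderEntry" qualifier="entries" cardinality="many" ordered="false">
              <custom-properties>
                  <property name="ordering.attribute"><value>"entryNumber"</value></property>
              </custom-properties>
          </targetElement>
      </relation>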



      Condition Query
      Custom property: condition.query
      Holds a string that is added to the where part of the select query generated for a one-to-many or many-to-one relation.
      • Should not contain an ORDER BY part, as the condition is appended to the end of the generated query.
      • Only applies to relations of one-to-many or many-to-one type.
      • Can only be defined on one end of the relation, and must be defined on the sourceElement or targetElement that has the many cardinality.